A Date with Data
A Friend Indeed: Keeping Things on Track in Indiana
October 11, 2023
One thing we appreciate: a good friend who has our back and keeps us on the right track. At the Indiana Department of Education, that's Angela Vaughn, Assistant Director of Monitoring and Compliance. In that role, Angela helps oversee her state's annual LEA determination process, and host Amy Bitterman is eager to learn more. Join us for this week's A Date with Data, where Angela describes how she and her office help Indiana LEAs use data elements and apply criteria to the determination categories to satisfy this important, and mandatory, process.
Resources: Indiana Department of Education, Special Education Results Accountability website - DOE: Special Education (in.gov)

Reach out to us if you want to access Podcast resources, submit questions related to episodes, or share ideas for future topics. We’d love to hear from you!

You can contact us via the Podcast page on the IDC website at https://ideadata.org/.

### Episode Transcript ###

00:00:01.61  >> You're listening to "A Date with Data" with your host, Amy Bitterman.

00:00:07.45  >> Hey, it's Amy, and I'm so excited to be hosting "A Date with Data." I'll be chatting with state and district special education staff who, just like you, are dealing with IDEA data every year.

00:00:19.60  >> "A Date with Data" is brought to you by the IDEA Data Center.

00:00:25.01  >> Hello. I am Amy Bitterman, and this is IDC's "A Date with Data." On this episode, I have with me Angela Vaughn, who is the assistant director of Monitoring & Compliance from the Indiana Department of Education's office of special education, and she's going to be talking to us about Indiana's annual LEA determination process. For those who aren't familiar, IDEA requires states to make determinations annually about the performance of each LEA. There is variability in terms of how states use the data elements in the determinations, how they calculate determinations and also how they're applying criteria for the different determination categories. And we're really excited to hear about how you're doing this in Indiana, so welcome to the podcast, Angela.

00:01:12.54  >> Well, it's great to be here. I've heard many wonderful podcasts before, so this is a little intimidating after seeing the great, incredible people and the work that's being done and being part of this, so thank you.

00:01:25.35  >> Of course. Do you want to start off just by telling us a little bit about yourself and your role?

00:01:30.54  >> Sure. As you mentioned, I'm the assistant director of Monitoring & Compliance, so I primarily oversee the monitoring of the indicators and then other monitoring that we do internally with LEAs, and of course that includes the LEA determinations that we make annually. Prior to that, I was a local director, and prior to that I was a special educator in the field, so I definitely spent many, many years on the user end of this process, and so it's really exciting to have now come into the state side of it and see some ways we can make improvements.

00:02:12.29  >> Great. I'm sure having that experience more in the field has really benefited you too to have that perspective and bringing it to the state level.

00:02:19.68  >> Yes, I always think that's an important perspective to keep when we're doing this, the user-end perspective, so it was a big help.

00:02:28.19  >> Yeah, especially with determinations because you used to kind of be on the end of receiving them, right, and now you're managing how they get made and sending those notifications out and that whole side of things, so very different experience, I would imagine.

00:02:41.76  >> Yes, definitely.

00:02:43.18  >> So do you want to maybe start off by telling us kind of how you are making determinations in Indiana and why you've decided to kind of do things the way you have, changes maybe that you've made to the process over time?

00:02:56.60  >> Yes, definitely, so we'll kind of start at the beginning. When I came into the role as assistant director, some of the priorities that the state had set out were around technology modernization or IT modernization, and so there was a lot of work being done in other areas of data collection for all LEAs, and so it was really important for us and our team that we were part of that. We had a system that involved a lot of Excel spreadsheets and documents back and forth and things getting lost in e-mail, and had a lot of tracking issues from things being sent to different e-mail addresses. We knew we had some areas that we wanted to focus on in our determination, but really one of the big pushes for this work was that we wanted to have special education be included in the same place that other data was going. So if data managers were submitting things, and schools were accessing their data within what in Indiana is called the LINK Initiative or the LINK Portal, we wanted special education to be there as well, because I think overall that sends the message of inclusion. We want schools to be inclusive, and if we're in a separate place, and they're searching for special education data separately, we really felt strongly that that wasn't the message we wanted to send, so that was part of the big push: we wanted to move along with those modernization initiatives that were happening.

00:04:34.65  >> Mm-hmm.

00:04:35.55  >> So with that then, not necessarily specifically for that same reason, but we started working with the National Center for Systemic Improvement, NCSI, and we were working through a process with them called leveraging our general supervision system.

00:04:53.43  >> Mm-hmm.

00:04:54.04  >> And so we started to really take a deep dive into the mapping of our procedures, and then that conversation also included our priorities, and really what were we saying to LEAs is important.

00:05:11.80  >> Mm-hmm.

00:05:12.11  >> And that definitely needs to come across in our annual determination. So that led to again not just the process of how you receive it and how you access the data to what was included in that determination process, and what information were they looking at?

00:05:31.47  >> Right.

00:05:31.71  >> So with that process, we did, we brought together a fairly large stakeholder group that was hopefully representative of not just LEAs but other organizations that we work with and again started to look at our priorities. What's our vision? What are we currently doing that speaks to that vision? And then maybe what things were being included that we were not hearing at this point were as important as maybe they were when that process was first developed.

00:06:05.43  >> Mm-hmm.

00:06:05.72  >> So we ended up removing several things. For example, we heard loud and clear that graduation rate was important, and we wanted to keep that as a focus area.

00:06:20.64  >> Mm-hmm.

00:06:20.98  >> So one of the things we did was increase the weighting for the determination calculation related to graduation rate. But then one of the things we also heard was that our preschool outcomes data was not exactly where we wanted it to be, related to our assessment outcomes and how schools were able to use that, so we removed that component for now until we can get that fleshed out. We reworked what was included and how things were weighted based on that stakeholder feedback and engagement.

00:06:55.97  >> Okay. Were there other elements that you added in that maybe you didn't have before kind of based on the stakeholder engagement or just the priorities of the department?

00:07:05.73  >> Yeah. At this point, we did not add as much as we changed weighting.

00:07:11.53  >> Hmm.

00:07:11.74  >> So we were also very heavily focused on least restrictive environment and the inclusion of school-age children, and we had done really well there; we were at the higher ends of performance. Recognizing that we don't want to say inclusion is the only placement, and that there is a continuum of placement, we reworked that weighting because we still want to say, "This is important, and it's a focus area, but we're doing well, so let's acknowledge that schools are typically doing well." Where we shifted that weight was to growth. Our assessment growth was not weighted as highly as some of the other components, and we wanted to say, "We're not so worried about your proficiency level; we want to focus on paying attention to growth of students in terms of the assessment versus being proficient." So while those things were included, we kind of de-emphasized and re-emphasized certain points there.

00:08:22.75  >> Mm-hmm. Yeah, that makes sense.

00:08:24.58  >> So from there, we were able to develop many more resources because we wanted to release this at the next annual determination time. We wanted people to really understand what it meant, where the data came from, and then have resources, so that if this becomes an area you see from your determination that you want to address, you've got all of that information in one place. For each of the items we included, we reworked some of our compliance and monitoring guides and included the determination information in there, again speaking to: How do you locally find this data? What report does it come from, and when is that report collected? When do we make those determinations? That whole timeline process specific to what we call our results-driven accountability, which is our annual determination. And along with that, since this was a new system, we had to develop resources about just accessing the system and things like, what role do you have to be assigned to in your district, because it is a secure portal?

00:09:39.89  >> Yeah.

00:09:40.13  >> What roles do you have to have to access the data? Who can access the data? Who in the district do you want to access the data, and at what level? All of those kinds of things had to be developed along with the actual development of the system. So we definitely wanted to be transparent, and we wanted to have resources available so that people understood how to use the information. Not that there weren't things before, but really, as I've said, re-emphasizing the importance of these things as we were doing a kind of grand reopening of this work.

00:10:17.93  >> Mm-hmm.

00:10:18.17  >> So that also took some time, and all then was posted to our website prior to the opening of this dashboard.

00:10:27.82  >> And when did it first open?

00:10:30.36  >> So our annual determination happens annually in November, so November of last year was our first release, and it surprisingly went really, really well. We had very few bugs. We did have a few, but very few, and it was definitely really well received by the field. We had a lot of feedback that all of the things that were posted within the dashboard were also able to be downloaded into an Excel spreadsheet, so directors who had access could then easily download the information to share with their staff, but they had the primary access to that information. So from a technology standpoint, things went really well.

00:11:21.00  >> Great.

00:11:21.17  >> And then the feedback from the field was really positive, and they appreciated I think also just having things in one place, that they weren't going somewhere else to get that information, and then always available to them. It's still there. We have last year's information. We on the back end are loading this year's data, but they'll be able to then continue and have that data longitudinally. We're also looking at loading previous or historical data. We haven't gotten to all of that, but we hope in some of our next iterations to be able to do some trend analysis for the districts as well.

00:12:03.94  >> Wow, it sounds like a really robust and useful system, so exciting. Are there other benefits that you've seen to this new system?

00:12:14.76  >> Yeah, I think it's also helped because the way we set it up is different pages, so previously when the LEAs received this determination, it was kind of one long running page.

00:12:26.51  >> Yep.

00:12:26.68  >> It'd be three or four pages long, but it ran, but in this system now when the staff open up their portal, their dashboard in the portal, they will see different pages.

00:12:38.05  >> Mm-hmm.

00:12:38.31  >> So the first page is actually the determination and kind of the summary, so when they start off, they see where they are from our determination and our level of support ratings, and then it breaks that down. We have our compliance page, which then speaks to their determination, and that made it much clearer, I think, exactly how that determination is calculated. Then within the system, each of those also, if it was applicable, had a little information button, so if they wanted more information immediately about what that particular indicator was, they could open that up, and it would give them a little bit more information kind of in the moment.

00:13:23.85  >> Mm-hmm.

00:13:24.43  >> Then another page then was the results page, and so then we had our indicators that speak to results listed there.

00:13:33.33  >> Mm-hmm.

00:13:33.76  >> Then another page was the results page, where we had our indicators that speak to results listed. We were also able to add what we called a non-scored data page. That was data we wanted to share that was not calculated into the determination, but it's things we want to keep on the radar and keep having conversation about. One example is our preschool LRE, which was not included as a scored item, but we want to have a lot of conversation about that, and it may be scored in the future, but we want to get people to start looking at that data as well and know that we think it's important, so we want it included in some way. There were other indicators that weren't scored and included, but it was a way for us to acknowledge, "Hey, we're looking at this. Hey, we want you to pay attention to this." Then there was another page that kind of summarized and gave some next steps. One of the things we were also able to do was incorporate some of our corrective action requirements into the system, so it was kind of everything in one place. We were able to set up some Jotforms linked on that final summary ... or instructions, I think, is a better way to say it ... page, so that if you had a finding related to initial evaluations, there was a Jotform there where we would start that corrective process within that system, and we had it set up so that we could get that data directly. Again, it wasn't e-mailing things back and forth; it was in one spot. And on the back side, that gave us another great way to collect data more specifically on the noncompliance of the individual LEAs, to help support them and to see from a state level, "What resources are we needing to develop?" because of some of the analytics we could do with that data collection. Prior to that, although we were collecting it and tracking it, it was in so many different places that it was hard to make sure we were getting things directly.
It might have to go through two or three different transitions to get to the right place, so think of it as kind of a one-stop shop: everything related to your determination and correction was all there.

00:15:56.61  >> Yeah, that's wonderful, and I think, yeah, that one-stop shop and having everything in one place makes such a huge difference I'm sure for you all at the state and then also the districts too, and probably has really helped to improve the quality and timeliness and accuracy of all of the data, I would imagine, instead of having things in different places and having to track things in different ways that can kind of lead to more of those data quality issues. If it's all in one place, it seems like it would be easier to ensure the quality.

00:16:30.47  >> Yes, you described that perfectly and really what our goal was, and then fortunately that's really what we saw comparatively to how we had been managing that before.

00:16:42.77  >> It sounds like you had a really good launch of the system. Were there any challenges that you have encountered though, and how have you addressed them?

00:16:53.32  >> Yeah, it's been a lot of collaboration. We did work with an external vendor in developing the actual dashboard itself. It is run by Power BI, and we have closed out with that vendor, so this next round is where we are working internally with our IT department. This is, again, not right at our fingertips. We have to go through another department to make sure that data is loaded, and loaded in time that we can do some quality checks on it. So while it's our project, and we're leading that project, we're having to reach out and have communications with other departments that have lots of projects going on. Our IT department is working with everyone, really, so we're making sure that we're getting on their radar and staying on their radar, because we have data coming in all the time. We're in that process, and although right now it's going well, sometimes it's not as fast as we would like to see it go because we don't have the direct access to loading the data.

00:18:07.89  >> Yeah, that all makes sense and I'm sure is something very common in a lot of the work that you do in other states as well. It's kind of when you have a vendor, there's things that you're working through, but then similarly you would think, "Oh, if we bring it in-house, it'll be different. It may be easier, and we'll have more control," but then there are other hoops you have to jump through, of course, doing it that way, too. So that definitely makes sense, and so far that's great that it's been a pretty smooth launch and new system.

00:18:38.76  >> Yeah, we're definitely hopeful for the next round. Again, you don't know what will come up, but we're definitely trying to get ahead of things, get things loaded and reviewed as soon as possible, as far as is reasonable given when the data collections and things like that are happening, but we're feeling good. Our IT department is really confident that things will go well, and given the success we had last year, I think it's also reassuring that this year we won't have the new factor. People will ...

00:19:15.97  >> Mm-hmm.

00:19:16.47  >> They'll know where to go, they'll know how to log in, so a lot of those things that we struggled with last year, we're going into this year thinking we're not going to have as much of that kind of questioning or those issues.

00:19:27.37  >> Mm-hmm. Yeah, there's always a learning curve, so hopefully, yeah, this year will be even easier. Do you have plans for the future in terms of maybe other changes or what you hope to do moving forward with the system and determinations?

00:19:42.42  >> Yes, definitely. I mentioned we want to load previous data, and we're having some discussions about that, but also, as we're moving forward, one of the main things we've had discussion around is that right now this is district-level data, and we want to move to school- and student-level data. Although it's the school's data, what we find, especially for our special education directors, is that they aren't always the ones that have access to that data directly, and we feel like if this is something we could provide for them, it just helps them be able to better use their data. So the assessment data, while that comes in and the school has it, sometimes ... in our state, the testing coordinator would have that data and not necessarily the special education director, so helping to drill down for LEAs that may not have that available to them directly, that is something we've heard would be a potential next step.

00:20:59.30  >> Great. Is there any ... I know this is all not information that the public can access since it's the district's data, but is there anything on your website that if other states are interested in just getting a sense or a taste of what this is like, there's something that they might be able to look at? I don't know if you have screenshots or ...

00:21:21.09  >> We do, yes, yeah.

00:21:22.84  >> Great.

00:21:23.15  >> So on our website, we do have a tab that is called Results-Driven Accountability. We wanted it direct so that the field could get to those resources directly, and there is a user guide at that location, and, like you said, it has screenshots. It has actual information about what is included, and then there are other resources that clearly define how we're weighting those measures. What measures are included? How are we weighting them? Exactly how is that calculation met? There are certain things that ... for our score, we typically use a one-through-five scoring, and so how we assign those scores to different districts. We do use an enrollment size. We have such a wide variety of schools in our state, from very small rural schools to large urban schools, so we have some different calculations that we use based on your enrollment size. We're hoping that compares apples to apples ...

00:22:34.20  >> Right.

00:22:35.32  >> ... because what we were hearing from people is, "We're just so different that we really shouldn't be compared to each other in certain ways." That was something that was decided back in the original determination process, but we have continued it.

00:22:55.11  >> Okay. Well, we'll definitely put a link in the description of the podcast episode for anyone listening that wants to check out and see more of what Angela's been describing to us. I'm sure people will.

00:23:07.61  >> That would be great, and we'd be happy to answer any questions or share any more information that we can about what we went through.

00:23:16.12  >> Great, well, thank you so much for talking about your amazing, robust process. We really, really appreciate it, and thank you for being on.

00:23:27.27  >> Well, happy to be here and definitely appreciate the opportunity.

00:23:32.89  >> To access podcast resources, submit questions related to today's episode or if you have ideas for future topics, we'd love to hear from you. The links are in the episode content or connect with us via the podcast page on the IDC website at ideadata.org.