
Meeting Details

Meeting Date:

Purpose: Data Warehouse Advisory Group

Zoom Recording: https://zoom.us/rec/share/jk5xSVwxh6TPBs8zj2sTx0FF9KvwDkNYCuOtWcK2B6A_nJf_Ao-8xxFYrvYLNqJ9.oug34GgC9CLO9tBX (Passcode: af6Kw#P*)

Participants:

Mark Cohen, Crystal Hernandez, Alexander Jackl, Manos Stefanakos, Bridget Herrin, Craig Hayward, Dulce Delgadillo, Jenni Allen, Virginia Moran, Z Reisz, Denice Inciong, David Kendall, Dustin Tamashiro

Agenda

Item

Notes

1

CCC Data update

Data Warehouse Report Server

  • 95 colleges live with Data Warehouse report server

  • Outreach continues to bring remaining colleges onboard

CCC Data 2.0.0

  • Provides direct access to the Data Warehouse

  • Deployed to Pilot 10/2/20

  • 6 Pilot Participants

    • Team/Enabling Services working with Phase 1 Pilot Colleges to set up a site-to-site VPN connection so that pilot testing can commence

    • 1 college live

    • Engagement started with Phase 2 Pilot Colleges

  • Production in November

CCC Data 2.1.0

New data sources in the DW, segmented by MIS code and made available to CCCs through the DW Report Server (DWRS) and direct DW access: Canvas (as opted into by individual colleges), MyPath, COCI, C-ID

New data sources in the Data Lake (DL), made available to the CCCCO: NOVA, LaunchBoard, DWRS CVC-OEI enrollment report

  • Development work has begun

  • Pilot Q2

  • Production Q3

2

Update on the status of Vision for Success and Student Equity measurement metrics

Student Equity and Achievement/Vision for Success Metrics

  • Presentation provided by Manos Stefanakos

  • Addresses data change management issues and how they connect to the metrics

  1. Background: Student Equity and Achievement (SEA)

  2. SEA Reporting Requirements

    1. Executive Summary (Includes goals identified, activities to achieve those goals)

    2. Budget Information

  3. SEA Sample Report from NOVA

    1. Set of demographic groups

    2. Four categories of metrics they can select from (Access, Readiness, Retention, Completion)

    3. Can select the goal and the gap they are seeking to close

    4. Report is in table format, comes out of NOVA where reporting is done

    5. Accompanied by planning information in free-form text fields to support the report table

  4. What are the SEA Metrics

    1. Colleges have to select at least 5 different groups of traditionally underrepresented students and select a metric for each (from the Student Success Metrics list) in the following categories: Access, Readiness, Retention, Completion

  5. What are the SEA Metrics - Examples

    1. Percentage of students who enroll after applying (Access, SEA)

    2. In parentheses you can see the SEA metric category

    3. There is an analogous component that comes from the other two sets of metrics being evaluated

  6. Data Lake/Data Warehouse Inclusion

    1. The NOVA annual reporting includes both metrics as well as planning and activity information that is provided in the form of text

    2. The metrics are generally student counts/ratios, often disaggregated by subgroups of students, e.g., by race, gender, LGBTQ, veteran, and economically disadvantaged status

    3. The numbers behind these already exist in MIS data, as student counts and demographic flags on individual student records

    4. The text fields could be brought in to provide additional insight

  7. Suggestions for SEA Metrics

    1. While ideally one would recreate the NOVA report using MIS data, because of timing and other reasons it is suggested that the NOVA SEA report be brought in as-is, since it might be impossible to recreate the numbers from existing MIS data.
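Item 6 above describes the SEA metrics as student counts/ratios disaggregated by demographic flags on individual student records. A minimal sketch of that computation is below; the record fields (`applied`, `enrolled`, `veteran`) are invented for illustration and do not reflect the actual MIS layout.

```python
# Hypothetical sketch: "percentage of students who enroll after applying"
# (an Access metric), disaggregated by a demographic flag on each record.
# Field names are illustrative only, not actual MIS element names.

def access_rate(records, group_flag):
    """Enrollment-after-application rate for records where group_flag is set."""
    group = [r for r in records if r.get(group_flag)]
    applied = [r for r in group if r["applied"]]
    if not applied:
        return None  # no applicants in this subgroup
    enrolled = sum(1 for r in applied if r["enrolled"])
    return enrolled / len(applied)

# Toy records standing in for individual student rows with demographic flags.
students = [
    {"applied": True,  "enrolled": True,  "veteran": True},
    {"applied": True,  "enrolled": False, "veteran": True},
    {"applied": True,  "enrolled": True,  "veteran": False},
    {"applied": False, "enrolled": False, "veteran": False},
]

print(access_rate(students, "veteran"))  # 0.5
```

The same function applied per demographic flag yields the disaggregated table the NOVA report presents.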

Questions/Concerns

  1. Virginia Moran: Concern. “In the past, SEA was pre-populated with MIS data and then we would just respond. But in the future, you're saying enable us to change what was reported in state MIS?”

    1. Manos Stefanakos: “It looked like in NOVA you were able to make changes. I could be wrong about that.”

    2. Elaine Kuo: “We just input the data; we receive the data file and then enter it into NOVA. There was no additional manipulation. There were some drop-downs, but I don't remember us doing any manipulation.”

    3. Virginia Moran: “Elaine, when you entered in your goals, wasn't the actual baseline data pre-populated?”

    4. Elaine Kuo: “Yeah, there was no manipulation, and depending on what boxes you check, that would also create other drop-downs, was my recollection, but there was no additional analysis of the data or inputting of the data.”

    5. Manos Stefanakos: “Okay, and I apologize; that might have to do with the fact that I don't actually have an account to be able to do that work, so we're only able to see from help and other things how that looks.”

    6. Craig Hayward: “One thing I can think of that may go to this: I think it was possible to add a group that was not included in the derived data that was sent to the colleges, so that may be the piece that can't be replicated or picked up automatically from MIS.”

    7. Virginia Moran: “So I agree, those kinds of things would be helpful for those of us who don't have a good handle on foster kids, for example, or who have issues with how economically disadvantaged is collected on the campus, to kind of clean some of those things up. But in the future, I guess I hesitate to allow colleges to submit anything different from what's already pre-populated via state MIS; I would prefer they clean up state MIS.”

    8. Manos Stefanakos: “Absolutely agree. So if that's the case, then it alters my suggestion. If it is based completely on MIS data, I don't know that we need to re-import it as it comes out of NOVA. The only suggestion I would have is that, if possible, a similar view to that reporting table be available, maybe as part of the reporting or in a table that is saved somewhere, either in the Data Lake or the Data Warehouse. Mark and Steve, I'll let you guys respond to that; let me know what you think.”

  2. Manos Stefanakos: “Mark, Steve, any thoughts on maybe saving the analogous data that would populate this report, other than Jasper reports, or someplace else?”

    1. Mark Cohen: “It sounds like it would make sense to add this as a data source that will be available through the Jasper reporting and direct access. I know when we talked about it before there was a question as to whether this was a one-off or ongoing. How frequently updated are the data sources that make up the report?”

      1. Virginia Moran: “I believe it's annual, because most of the indicators are lagging indicators. The only thing that we do locally, at least for VVC, is the term persistence one. But we are required to report annually on this, and I don't think it's updated any more frequently than that; honestly, they can't even update the term data.”

      2. This is something that should be on Data Source prioritization list

  3. Dulce Delgadillo (She/her): “Are we saying that this data set would only have disaggregation for metrics that are associated with student equity, versus all of the other initiatives, so SSM, and we look at adult ed, because some of those are also presented in a disaggregated method? So I'm just a little curious about the overlap. And when we say looking at SEA metrics, are we saying that the database would be able to produce a report that would list all of the SEA metrics with the disaggregation that the state has told us to look at? Because that also shifts, right?”

3

Change Management: to include both an update on change management efforts at the CO and a discussion of how we can best support changes to metrics and data elements.

  • Presentation/Discussion: David Kendall

  • Address Change Management as process/artifact

4

Discussion on concept of enabling districts to share data (if time allows) 

5

Zoom Chat

00:18:34 Dulce Delgadillo (She/her): That was our experience too at NOCE
00:32:53 Craig Hayward: Many of the current SEA metrics are unusual in the way they are calculated.
00:33:14 Craig Hayward: As a field, we are waiting for cohort-based SEA metrics to drop.
00:45:41 Virginia Moran: @David can you please speak to maybe starting assumptions/core principles as well?
00:47:18 David Kendall: Yes, thank you Virginia
00:50:48 Dustin Tamashiro: I just want to say what you're doing is really important and really appreciated.
00:55:47 Virginia Moran: Agreed, Dustin.
00:56:24 Valerie Lundy-Wagner: Isn't this a legal issue?
00:57:16 Bridget (she/her/hers) Herrin: Do you mean under FERPA? If its being used for legitimate academic purposes it should be covered
00:58:30 Elaine Kuo: How much coordination is there between the data warehouse work and the launchboard efforts? I recognize that is part of our continuing conversation. Should there be representation from the launchboard side on this task force (especially when these data elements are discussed)?
00:58:36 Valerie Lundy-Wagner: I think that sharing data should be done only when needed/necessary. The "legitimate" academic purposes has not been articulated to me just yet, but I understand it is possible.
00:58:51 Valerie Lundy-Wagner: The question was posed as 'of interest' and it should be focused on the problem we're trying to solve.
01:00:06 Valerie Lundy-Wagner: Thanks Elaine, yes, there does need to be a bit more conversation between these meetings and the Launchboard conversation. I don't know if John Hetts was invited or just couldn't make this time, but it seems appropriate for him and me to spend a bit more time with Mark and team to make sure the Research and Data Analytics conversations align with the MIS-type conversation.
01:00:07 Dulce Delgadillo (She/her): Completely agree with Denice, there's a lot of leg work to do on the ground to just get to a point of sharing, so to leverage this data set would free up those resources
01:00:28 Alexander Jackl: @Elaine It is part of the data harmonization efforts to coordinate LaunchBoard, MIS, and DW, so we will be working on that and continuing to work on that
01:02:14 Denice Inciong: Thank you, this is hard but needed work. :)
01:02:34 Elaine Kuo: Appreciate all the hard work on this effort.
01:04:27 Jake Kevari: Thank you!

Issues/Questions Resolved

Issue/Question

Resolution/Answer

Date Resolved/Answered

Owner

1

Issues/Questions Needing Resolution

Issue/Question

Resolution/Answer

Date Resolved/Answered

Owner

1

What do analytics indicate about CO MIS usage with regard to the data mart?

Alex Jackl will work with Todd Hoig to get an answer to this question and will share the response with Mark and the Advisory group.

Per Alex, the CO has been running some web page analytics against people accessing the Data Mart, so we might have some data from that.

Alex will share responses with Mark so that he can share with the Advisory group.

Alex Jackl

2

Should Vision for Success and Student Equity measurement metrics be included?

Manos Stefanakos will look into it and report back to the group on whether it's something that will be published again, or what the formula is in terms of how it is produced

Manos Stefanakos

3

Jake Kevari:

Discussion on the possibility of sharing data from the Data Warehouse with other colleges or districts.

Would this be across aggregates or at the local level?

Will add as an agenda item to an upcoming meeting.

Mark Cohen

Action Items/Next Steps

Item

Notes

Owner

  • In addition to documentation, look into creating a Webinar for the group

  • Manos Stefanakos will share the Student Equity and Achievement/Vision for Success Metrics presentation with the group
