GT Institute Review Committee

Meeting Minutes, April 15, 2002

College of Sciences Conference Room, Tech Tower

 

A meeting was held in the College of Sciences Conference Room on April 15, 2002, at 10:10 a.m. Members present included Jim McClellan, Joe Hughes (liaison from the UCC), Paul Wine, John McIntyre, Andy Peterson, Joseph Hoey, Farrokh Mistree, Ron Arkin, and Russell Gentry.

 

Presentations were scheduled from various individuals who have participated in the program review process. Each presentation focused on the following questions:

- How did you do your program reviews?

- What were the major findings?

- What are your comments on the process?

 

I. Dr. Andy Smith, Chemistry & Biochemistry Program Review Process

A. Program Review.

The review was done in October, before the guidelines were put together. The School had already been doing program reviews, and the process has now come full circle.

 

B. External Visiting Committee

The Committee was made up of a distinguished group.

The Committee was given access to the following:

·         faculty vitae

·         strategic plan

·         outcome assessment reports for undergraduate and graduate programs for the last two years

·         meetings with faculty, graduate students, and staff

·         lunch with chairs of other departments in the College

·         a meeting with the Provost

·         a debriefing with Gary, Kent, and Andy, joined by Mike Thomas; Joseph sat in on the debriefing and asked questions

 

A good deal of their time was spent in parallel meetings.

 

It was a positive review. The Committee looked at the structure of the department in the context of Georgia Tech and how effective it is, the improvements it has made over the last eight years, and how the School has improved in the rankings.

 

Recommendations included the following items:

·         hiring in the mid-age range level

·         hiring a new chair when Thomas Moran retires

·         hiring a school administrator to see that the day-to-day affairs are handled better

·         committing to facilities and faculty hiring by Georgia Tech

·         testing entering Chemistry grad students, which would reduce the workload on teaching assistants

·         benchmarking the workload for Chemistry TAs against other schools’ TAs

 

C. What can be done to smooth out the BOR review process?

·         Keep the School’s flexibility with the reviews.

·         The School asked the reviewers to look into specific issues that would not be covered by the template.

 

D. Do you have any suggestions for solutions to the documentation burden imposed by the program review process?

·         Dr. Smith was interested in the current strategic plan, setting up the agenda for the next five years, and getting buy-in from the faculty. The Committee’s interest was in the strategic plan.

·         The Department hosted the Committee. The Committee sent its report to the Dean’s committee, but the visit was generally seen as being hosted by the Department.

 

E. Has the College talked about the process of selecting people who are on the visiting committee?

·         Putting together a committee to come every five years is not an easy task.

·         They ask for faculty suggestions.

·         They get advice from the Chair.

·         They want a distinguished person to serve on the Committee.

·         They like to make sure one person is from the University System of Georgia. Recently a Chair from UGA served on the Committee, and he could answer other Committee members’ questions about the system.

·         They look more at the individual than at which school the individual comes from.

 

F. Remarks about the review process?

·         It added some workload.

·         The external review would have happened anyway and can be used as evidence for SACS. If anything, it made the process more systematic.

·         The Department’s primary interest is in research and trying to promote a national image. This is what the external review is about—what the deans want. The emphasis is on productivity, not on assessment.

·         They could have someone from the IGC and the UCC sit in on the exit interview and ask questions of the Committee.

 

II. Dr. R. Gary Parker, Industrial & Systems Engineering Program Review Process

The External Review Committee will visit April 29 through May 1.

A. Template.

·         They tried to follow the template, but there were some things that didn’t make sense.

·         Going through ABET helped; they took information from ABET and SACS.

·         The size of the document was too large.

·         It took a good one-half to two-thirds of a month.

·         They looked at the document as a PR device, an honest compilation of who and what they are.

 

B. Process

Preparation began with Dr. Parker, who serves as director of all academic programs. He prepared the documents and gave the first draft to the School Chair, Bill Rouse. After getting the Chair’s feedback, he refined the document and returned it to Bill Rouse. He then gave the document to the internal school faculty advisory committee but has received no feedback from them.

 

C. Findings

The process produced no major findings; it just corroborated some of the things they already knew:

·         class sizes are too large

·         advising—resources are low; students are happy with advisors in terms of quality but not in terms of timing, since they have to wait to see an advisor.

·         Transfer numbers are huge—this is an area of concern.

·         Grad enrollments (master’s increased from 120 in 1997 to 240 currently; Ph.D. increased from 90 in 1997 to 180 currently). Ph.D. and undergraduate enrollments are going to be stabilized. Master’s enrollment is problematic—the class sizes are huge. In four years’ time, this growth is going to hurt them. The master’s is all coursework, not research-based.

D. External Review Committee

They are inclined to look for someone involved in operations research to serve on the committee. One reviewer will be from industry, as suggested by Bill Reynolds. What they hope to learn from the Committee is corroboration that this is a highly regarded program worthy of its reputation. They already know their weaknesses.

 

E. Value Received.

They will use the Review Committee to relay information they want to the Dean, such as the following items:

·      look at large class sizes

·      model use of TAs

·      look at subdisciplines they have tried to staff

F. Recommendations to Improve Process/Documentation

·          Prune the document down—the volume of the document was a shock.

·          Prioritize some of the questions on the document that must be answered and then give the option of adding on other questions.

·         Dr. Parker would be willing to share his document with other units.

 

III. Dr. Nate Bennett, Management Program Review Process

A. Process

Management had several committees and groups in place already, including the following:

·         IT Committee

·         Strategic Planning Committee

·         Undergraduate Curriculum Committee

·         Graduate Curriculum Committee

·         Undergraduate Office

·         Graduate Office

 

He brought the groups together by mid-term in the fall. They spent most of November writing. He had drafts at the beginning of spring semester. The External Review Committee visited March 13–15.

 

B. Findings

The External Review Committee report was favorable. The tone was “make the most of the opportunities of the new building.”

 

Mission—Their mission says they are preparing leaders in technological environments, which is difficult to operationalize. They are changing as the market changes—from dot-coms, to investment banking, to students just wanting a job. The mission will stabilize over time and attract a broad range of students.

 

Curriculum—The report did not point to unrecognized deficiencies:

·         semester conversion didn’t go perfectly

·         international curriculum

·         ethics is not taught at the graduate level

·         help students develop a better appreciation of technology

 

Faculty—The report recognized the aggressive growth plans and the fact that the College had thought about strategic planning; the Committee met with faculty and students and found they were pleased.

 

Facilities—The report had positive remarks about the new facilities.

 

Research—The report identified that most faculty are engaged in quality research, although some are not as productive as others.

 

Assessment—The report recommended that the College participate in the AACSB/EBI benchmarking assessment, in which the College had previously elected not to participate.

 

Overall Recommendations—

·         change the MSM to an MBA, which is up for approval at the Board of Regents.

·         grow the program

·         grow the Ph.D. program as the faculty grows

·         develop strategy in the community for the new facility to attract students

 

C. Comments on the Process

Note that the documentation was completed prior to the development of the template; the template would have been more cumbersome to use.

 

D. External Review Process

·         They prefer reviewers from schools where technology is a big component.

·         The people brought in are involved in the accrediting process; where appropriate, they specifically looked for members of the AACSB panel of deans.

·         Management pre-mailed their report to the review committee.

·         Management set up a workroom for the reviewers that contained all of the documentation, including faculty CVs, syllabi, catalogs, program flyers, AACSB reports, etc.

·         The review team was very sensitive about what it put in the report—in particular, no negative information about individual faculty members. The external reports usually come out after they have been to the Dean’s office.

 

E. Benefits of the Process

·         It was not much different from any other assessment review.

·         It was quite similar to what AACSB identified and what they identified themselves.

·         Every degree program offered and every non-degree program is discussed—the review is quite broad.

·         They worried about students and student quality.

 

F. Length of Time Spent on the Process

It was hard to estimate the time spent—roughly three weeks to a month (200 hours).

 

G. Value

He didn’t really see value in the process; it was more of an obligation. If they didn’t have the AACSB review, maybe it would have been different. The value was in the visit by the review team—three bright guys from good schools. The interactive exchange was helpful. The report was onerous. If he could change anything, he would increase the amount of interaction time with the review team.

 

H. Other

·         Management could prepare the BOR report in the year immediately after the accreditation cycle, using the same report and updating it for the BOR.

·         If Management did not have to pull the documentation together for the BOR, they would give the visitors the AACSB review. They would still use faculty and deans from other schools as consultants.

·         Dr. Bennett said that he would help the committee compare what the accrediting board requires with what the template requires. He will help them prioritize the template.

 

IV. Dr. Ellen Dunham-Jones, Director, and George Johnson, Associate Director, Architecture Program Review Process

A. Review Process

A substantial part of the review is focused on the curriculum. Areas reviewed include:

·         Assessment

·         Equity

·         HR

·         Physical Resources

 

B. Visiting Team

The visiting team looked at 37 student performance criteria that are mapped against every course in the program and color coded as follows (a sketch of such a matrix follows this list):

·         students must be able to

·         students must be aware of

·         students must have an understanding of and have been tested on
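
A minimal sketch of how such a criteria-to-course matrix might be represented, written in Python; the course numbers, criterion names, and level labels are hypothetical illustrations, and the actual 37 criteria are not reproduced here:

    # Hypothetical sketch of the color-coded criteria-to-course matrix.
    # Course numbers, criteria, and level labels are illustrative only.
    ABLE, AWARE, UNDERSTAND = "able to", "aware of", "understands (tested)"

    # Map each course to the criteria it covers and the level of coverage.
    coverage = {
        "ARCH 1001": {"design fundamentals": ABLE, "history": AWARE},
        "ARCH 2010": {"structures": UNDERSTAND, "design fundamentals": ABLE},
    }

    def courses_covering(criterion):
        """Return the courses that address a criterion, as a visiting team
        would scan across the full set of 37 criteria."""
        return [c for c, crits in coverage.items() if criterion in crits]

    print(courses_covering("design fundamentals"))  # ['ARCH 1001', 'ARCH 2010']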

Members of the team include the following:

·         practitioners

·         regulators

·         students (from the AIS [student wing of the IAA], which makes certain the students’ voice is heard; students help write the report)

 

A team room is set up for the visitors, where they can review records, including the following:

·         exhibitions of high-pass work and low-pass work (graduate and undergraduate)

·         notebooks on every course

·         performance criteria

·         faculty publications over the last five years (since the last visit)

·         assessment (using feedback to improve work over the semester)

 

Crosswalk—a complete set of documentation that meshes the AAB process and the BOR process more closely.

 

The visiting team gave them a chance to show off. They rank fifth in the country but are not well known. The visit also gave them a chance to learn how other programs are run.

 

C. Problems

·         Lack of consistency in data from the Institute. Some of the data received was for the College of Architecture and some for the School of Architecture. The modeling of the data has to be changed, or the College of Architecture needs to be viewed collectively. There are 800 students in the College of Architecture, 400 of them in the School of Architecture.

·         Can’t tell if resources are covering all programs in the College of Architecture.

·         Received the data late—one to one and a half weeks before the report was due.

·         Difficulty in tracking graduates; they don’t have the tools.

·         The way data are collected is a problem; there are no procedures. There should be a handbook for chairs.

 

D. Template

·         The template should have non-negotiable priorities that must be completed.

·         The template asks an awful lot of questions.

·         Some questions have so many parts that it becomes huge. For example, “describe assessment” becomes seven pages.

·         Some of the jargon used in the questions needs to be translated into plain English, or parenthetical explanations should be added.

·         They would be willing to use the template if Assessment could put it in a precise way.

 

E. Other Remarks

·         The next accreditation from AAB will be in six years if they get the highest marks.

·         George Johnson said he would be open to help the IRC see what portions of the review made sense.

·         The Accrediting Board is not reviewing the core curriculum; it is depending on Georgia Tech to do that.

·         They did not have a visiting team separate from the accrediting board’s.

 

V. Dr. Richard LeBlanc, College of Computing Program Review Process

A. Process

The self-study was completed before the template was made available. Sections of the self-study were completed by different individuals and the information was compiled by Dr. LeBlanc and his assistant.

 

B. Review Committee

The Committee was rather large, made up of seven members. Members are or have been chairs of Computer Science at other universities and are involved in professional organizations.

 

The visiting committee had the following information with which to conduct its review:

·         List of questions developed by COC geared toward their strategic planning

·         Institute-wide data (not focusing just on the program) for comparisons

·         Executive summary outlining major issues and providing supporting data (10 pages) and strategic plan

·         Road map for each question, indicating where to look up the information applicable to that question

 

The focus of the questions was to find out what the program needs to do to be in the top five and to find out if there are any obvious problems or issues.

 

The visit schedule included the following:

·         The committee broke into groups of two and three.

·         Ran three tracks for most of the day.

·         Met with COC faculty, staff, and students.

·         Met with external deans and associate deans (e.g., from GTRI) who represent interactions with the COC across the university.

·         Met with members of the COC Advisory Board Alumni Council.

·         Conducted late afternoon strategy direction sessions.

 

C. ABET

·         The next ABET review will be in 2003.

·         ABET goes into great detail about the program itself, and a lot of the information is reusable for the BOR.

·         It would be desirable to coordinate with the ABET review—with accreditation in the fall, having other reviewers come in February/March wouldn’t be too difficult.

 

D. Comments on the Process

·         Difficult getting people to create/update all pieces and then to edit the document to make it useful.

·         Difficult getting errors in data portfolio corrected.

·         Data portfolio does not provide answers to some of the data-oriented questions.

·         Integration of the data portfolio components into the self-study report should be simplified.

·         The space data was incorrect.

·         There is a need to rethink the by-major analysis at the freshman level, since many freshmen don’t know what they want to major in and often change majors. Instead, look at freshmen at an institution-wide level. It doesn’t make sense to look at individual programs, because that doesn’t deal with major changes effectively. First-year retention isn’t a major-specific issue, since relatively few of a freshman’s courses are in the major.

·         Retention analysis of majors should start after the freshman year and explicitly deal with major changes (in and out); a sketch of such an analysis follows this list.

·         A visit and report by the review team is extremely valuable, and it is important for the College to maintain control of this process, including what reviewers they bring in.

·         The accreditation process occurs in fall; it wouldn’t be too difficult to have the review team come in February or March.
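
A minimal sketch, in Python, of the major-change-aware retention analysis suggested above; the student records and major codes below are hypothetical illustrations, not Institute data:

    # Minimal sketch of retention analysis that starts after the freshman
    # year and explicitly tracks major changes (in and out). All records
    # and major codes are hypothetical.
    from collections import defaultdict

    # Each record: (student_id, major at end of year 1, major at end of year 2);
    # None in the last slot means the student left the institution.
    records = [
        ("s1", "CS", "CS"),    # retained in major
        ("s2", "CS", "ISYE"),  # switched out of CS, into ISYE
        ("s3", "ISYE", None),  # left the institution
        ("s4", "CHEM", "CS"),  # switched out of CHEM, into CS
    ]

    stats = defaultdict(lambda: {"start": 0, "retained": 0, "out": 0, "in": 0})
    for _sid, m1, m2 in records:
        stats[m1]["start"] += 1
        if m2 == m1:
            stats[m1]["retained"] += 1
        elif m2 is not None:
            stats[m1]["out"] += 1  # a change out of the major is not attrition
            stats[m2]["in"] += 1   # credit the receiving major

    for major, s in sorted(stats.items()):
        if s["start"]:
            print(f"{major}: retained {s['retained'] / s['start']:.0%} "
                  f"of {s['start']}; out {s['out']}, in {s['in']}")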

 

VI. What was learned in the process?

·         Programs that have had accreditation reviews don’t have as much trouble completing the program reviews. It’s a matter of moving data around.

·         Need to figure out how to balance out the value of the process with the effort of the process.

·         The presentations indicate that the external review process has the most value.

·         Make the template tabular and only comment when there is something meaningful to say.

·         See if there is a way to coordinate with the accrediting process, e.g., Management, and rewrite a comparable document.

·         In Engineering, for example, no one wants to do ABET in the fall and then review all nine programs in the spring.

·         Have the advisory board come in every year, with documents prepared in June. Everything is finished in one year, and information goes to the dean at one time instead of piece by piece.

·         Data issues—incompatibility of college-level vs. unit-level data (Architecture).

·         Need to look at the programs up for review next year and get the data profile generated as soon as possible, which would facilitate writing of the self-study.

·         Support—Some units need support to bring in reviewers. Could be done at the dean’s level.

·         The BOR is requesting a summary. They are driving the process with their questions.

 
/sw