WA Workforce Investment Act Customer Satisfaction Survey (WA WIA)


This was the largest multi-year project at CSRA ($450,000 over three years). This federally mandated survey evaluated the satisfaction of participants and employers in various programs funded by the Workforce Investment Act in the state of Washington, using the American Customer Satisfaction Index (ACSI) developed by the University of Michigan. Funding for the regional offices (WDCs) was tied to the satisfaction index: if the index fell below a threshold, funding would be in jeopardy.

We contacted nearly 1,200 households and businesses each month in this multi-mode (phone, web, fax, and email) survey to achieve the 70% response rate mandated by the contract. To improve the response rate, I coordinated with 12 regional offices (WDCs) in Washington to obtain contact information for unreachable participants. Twenty-five programs in SAS, SPSS, Excel, and Access were involved in coding, analyzing, calculating, and reporting the results, which were uploaded to a database-driven website within 24 hours. As the project manager of record, I won this project back for CSRA in 2007 when it was opened for bidding. In the same year, we achieved our best performance, thanks to our excellent interviewers and the support I received from the WDCs.

One of the creative ways I improved the response rate in 2007 was by introducing online surveys that asked the specific questions used to compute the ACSI.
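The actual ACSI is computed from a proprietary latent-variable (PLS) model, but the core idea can be sketched as a weighted combination of the three standard ACSI items, rescaled from a 1–10 response scale to the 0–100 index scale. The item weights below are purely hypothetical:

```python
# Illustrative sketch of an ACSI-style score. The real ACSI derives item
# weights from a latent-variable (PLS) model; here the three standard
# items (overall satisfaction, comparison to expectations, comparison to
# an ideal) are combined with hypothetical fixed weights instead.

def acsi_score(satisfaction, expectations, ideal,
               weights=(0.4, 0.3, 0.3)):
    """Each item is on a 1-10 scale; the result is on a 0-100 scale."""
    items = (satisfaction, expectations, ideal)
    weighted = sum(w * (x - 1) / 9 for w, x in zip(weights, items))
    return 100 * weighted

# A respondent rating 8, 7, and 9 on the three items:
score = acsi_score(8, 7, 9)  # about 77.8 on the 0-100 scale
```

In practice the per-respondent scores are averaged (often with survey weights) to produce the program-level index that the funding threshold is compared against.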


National Trial Court Judges Survey


This was the first multi-mode project at CSRA ($85,000) after I introduced in-house web surveying capabilities in collaboration with the university IT department. We used an open-source survey tool, the Unit Command and Climate Assessment Survey System (UCASS), to program the surveys.

The National Trial Court Judges survey was conducted to learn about judges' decision-making processes, expectations, and practices. My law school background was very useful in providing feedback on the questionnaire, understanding this population, instructing phone interviewers, and writing a technical guide for coding open-ended answers. The other significant aspect of this project was that, despite the busy nature of the target population, we achieved a 50% response rate within five months (1,300 judges responded). This was accomplished by: a) allowing respondents to participate in multiple modes; b) systematic reminders, timely follow-ups, and prompt responses to inquiries with carefully written scripts; c) adhering to the highest degree of professionalism and courtesy; d) minimizing the hassle participants had to undergo to complete the survey; and e) having the survey endorsed by a reputable organization, i.e., the US Department of Justice.

The introduction of in-house web surveys cut the cost of online surveys by more than 80% at CSRA. The Trial Court Judges Survey was the first online survey CSRA conducted; prior to it, CSRA spent over $5,000 to host an online survey with an outside vendor. This cost was reduced to less than $1,000.


Social Studies Study


This survey covered topics such as the focus of teachers' social studies instruction, their views on the goals of social studies education, and their views on the quality of social studies textbooks. In 2005, the sample was designed to be representative of teachers of second-, fifth-, and eighth-grade social studies in regular public schools, whereas in 2007 it was a sample of teachers in public high schools. The sample was developed using a multi-stage stratified sampling methodology. In the first stage, a national public school sample was developed using the National Center for Education Statistics (NCES) Common Core of Data database, stratified by size, region, and location. Sufficient replacement schools were also identified for each stratum. In the second stage, each sampled school was contacted and asked to randomly identify a teacher. The members of the sample were contacted via a phone survey. In 2007, all public high schools in the U.S. were selected, and one teacher per school was randomly selected, with strict replacement within the school. The teachers were given the option to participate by phone or web. Post-stratification weights were applied to make the sample representative of high school teachers in the U.S.
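Cell-based post-stratification of this kind can be sketched in a few lines: each respondent's weight is the population share of their stratum divided by its share of the sample. The strata labels and population shares below are hypothetical:

```python
from collections import Counter

def poststrat_weights(sample_strata, population_shares):
    """Cell-based post-stratification: each respondent's weight is the
    population share of their stratum divided by that stratum's share
    of the sample. Weights average to 1, so the weighted sample size
    equals the unweighted one."""
    n = len(sample_strata)
    sample_shares = {s: c / n for s, c in Counter(sample_strata).items()}
    return [population_shares[s] / sample_shares[s] for s in sample_strata]

# Hypothetical example: high-school teachers are 60% of the population
# but only 40% of the sample, so they are weighted up.
sample = ['HS'] * 4 + ['MS'] * 6
weights = poststrat_weights(sample, {'HS': 0.6, 'MS': 0.4})
```

With more than one weighting variable the same idea is applied either to the cross-classified cells or, when cells get sparse, via raking over the margins.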


Journalist Survey


This survey asked questions about journalists' knowledge of the First Amendment and constitutional rights, the importance and performance of the news media, the role of new media, the confidentiality of news sources, and journalists' ethics. The challenge of this research was constructing a sampling frame of newspaper and television journalists. Through online research, I found that we could develop the sampling frame using Bacon's (now Cision) Media Source Premium Research Module database. Once contacted, the company informed us that UConn was one of its clients, and I was able to access this database free of charge through UConn's Media Communications unit as a result of our collaboration with the Public Relations Department on the UConn Perceptions Survey.

In the target population, newspaper and television journalists represented roughly two-thirds and one-third, respectively. However, to compare and contrast the attitudes of television and newspaper journalists, the television sample was over-sampled. The dataset was later weighted to represent the true population distribution. The source for post-stratification weighting was the Bureau of Labor Statistics' Occupational Outlook Handbook.


An independent two-stage sampling process was conducted. In the first stage, newspaper and television organizations were selected, stratified by estimated size (circulation and number of journalists were presumed to be proxies for the size of the organization). Some organizations were self-selected. The proportion of organizations in each stratum was roughly equal. In the next stage, journalists within these organizations were randomly selected, with strict sample replacement applied. These journalists made up the sample for this study.
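The two stages above can be sketched as follows: a simple random draw of organizations within each size stratum, then a simple random draw of journalists within each selected organization. The strata, organization names, and sample sizes are all hypothetical:

```python
import random

def two_stage_sample(strata, orgs_per_stratum, journalists_per_org, seed=0):
    """Sketch of a stratified two-stage selection. First stage: simple
    random sample of organizations within each size stratum. Second
    stage: simple random sample of journalists within each selected
    organization. `strata` maps stratum name -> {org -> journalist roster}."""
    rng = random.Random(seed)
    selected = []
    for stratum, orgs in strata.items():
        chosen = rng.sample(sorted(orgs), min(orgs_per_stratum, len(orgs)))
        for org in chosen:
            roster = orgs[org]
            k = min(journalists_per_org, len(roster))
            selected.extend(rng.sample(roster, k))
    return selected

# Hypothetical frame: two size strata, three organizations each.
frame = {
    'large': {'Post': ['j1', 'j2', 'j3'], 'Times': ['j4', 'j5'],
              'Herald': ['j6', 'j7']},
    'small': {'Gazette': ['j8', 'j9'], 'Courier': ['j10', 'j11'],
              'Ledger': ['j12', 'j13']},
}
journalists = two_stage_sample(frame, orgs_per_stratum=2, journalists_per_org=2)
```

A production design would also track each unit's selection probability at both stages so design weights can be computed for the analysis.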


National Environment Literacy Assessment (NELA)


The client contacted CSRA because of our expertise in post-stratification weighting. They had already gathered data but were unable to weight it to mirror the target population; complex, confusing, and contradictory sampling documentation made the weighting process difficult. A number of conversations with the client, along with analysis of the technical documents and the sample itself, unveiled the sampling strategy. The sample had been constructed as a multi-stage probability-proportionate-to-size (PPS) cluster sample. In the first stage, PPS sampling ensured that larger counties were represented in the sample. In the second stage, schools were selected, followed by classes. The disproportionate probability of selection had to be rectified by weights. Post-stratification weights for enrollment (size), race, and gender were applied to make the sample mirror its target population. A raking weighting process was applied, with iterations carried out until the compounded weights converged. The National Center for Education Statistics' (NCES) Common Core of Data (CCD) was used to obtain population parameters.
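The raking (iterative proportional fitting) step can be sketched as follows: each pass rescales the weights so one variable's weighted margin matches its population target, and the passes repeat until the adjustment factors converge. The variable names and margins below are hypothetical:

```python
def rake(weights, respondents, margins, tol=1e-6, max_iter=100):
    """Raking / iterative proportional fitting. `respondents` is a list
    of category dicts, e.g. {'gender': 'F', 'race': 'A'}; `margins`
    maps each variable to its target population proportions. Assumes
    every target category appears in the sample."""
    weights = list(weights)
    total = sum(weights)
    for _ in range(max_iter):
        max_change = 0.0
        for var, target in margins.items():
            # Current weighted total of each category of this variable.
            current = {c: 0.0 for c in target}
            for w, r in zip(weights, respondents):
                current[r[var]] += w
            factors = {c: target[c] * total / current[c] for c in target}
            for i, r in enumerate(respondents):
                weights[i] *= factors[r[var]]
            max_change = max(max_change, *(abs(f - 1) for f in factors.values()))
        if max_change < tol:   # compounded adjustments have converged
            break
    return weights

# Hypothetical margins: gender 50/50, race 70/30.
resp = [{'g': 'F', 'r': 'A'}, {'g': 'F', 'r': 'B'},
        {'g': 'M', 'r': 'A'}, {'g': 'M', 'r': 'B'}]
w = rake([1, 1, 1, 1], resp,
         {'g': {'F': 0.5, 'M': 0.5}, 'r': {'A': 0.7, 'B': 0.3}})
```

Each pass preserves the total weight, so the weighted sample size is unchanged while the marginal distributions are pulled toward the CCD population parameters.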


Care 4 Kids Child Care Rate Setting


CSRA was contracted to develop a methodology for identifying the payment (subsidy) rates for the Care 4 Kids child care assistance program. The child care payment rates were based on this scientific survey, which measured the actual rates charged by providers. The survey was required to achieve an over-80% response rate to reduce nonresponse bias. The sample was designed to optimize the accuracy of rate estimates across facility types and across regions in Connecticut. All licensed facilities were eligible for inclusion in the survey sample, and child care centers and family day care homes were randomly sampled according to scientific sampling procedures. Rates were based on three factors: 1) a statewide rate for child care for each age group and facility type; 2) a regional rate adjustment; and 3) a partial-, full-, and part-time adjustment. These factors were incorporated into the payment rate calculation using an iterative weighting procedure. Several percentiles of payment rates were also computed so that a desired level of subsidy payments could be selected. My role was to produce the weights and compute the child care payment rates in 2005, improving the methodology in the process.
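Selecting a subsidy level from the surveyed rates reduces to a weighted percentile computation: the smallest rate at which the cumulative provider weight reaches the chosen percentage. The rates, weights, and the 75th-percentile choice below are hypothetical:

```python
def weighted_percentile(rates, weights, p):
    """Return the p-th weighted percentile of provider rates: the
    smallest rate at which the cumulative weight reaches p% of the
    total weight. A rate-setting study might pick, say, the
    75th-percentile rate as the subsidy payment level."""
    pairs = sorted(zip(rates, weights))
    cutoff = p / 100 * sum(weights)
    cum = 0.0
    for rate, w in pairs:
        cum += w
        if cum >= cutoff:
            return rate
    return pairs[-1][0]

# Hypothetical weekly rates from four providers, equally weighted:
subsidy = weighted_percentile([100, 120, 150, 200], [1, 1, 1, 1], 75)
```

With survey weights in place of the equal weights, the same function yields percentiles of the estimated population rate distribution rather than of the raw sample.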


CT Workforce Investment Act Customer Satisfaction Survey (CT WIA)


This is similar to the WA WIA except that it was smaller in scale—2,000 completes per year—and it did not involve analyzing or interpreting data. We conducted monthly surveys of participants and employers (about 200 combined), cleaned and recoded the datasets, and sent them to the client. It was not multi-mode, and there was no process for obtaining contact information for unreachable participants, even though I recommended one to improve the response rate. We calculated the ACSI. In the end, I conducted an outlier analysis to weed out irregular or invalid responses due to interviewer error. I combed through interviewer feedback forms, talked to phone room managers, and read interviewer comments in the dataset to identify invalid responses. The process also required visually analyzing the dataset for possible discrepancies.
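One simple, robust screen of the kind used in such outlier analysis is flagging records that sit far from the median in median-absolute-deviation (MAD) terms, which is less sensitive to the outliers themselves than a mean-based z-score. The interview durations below are hypothetical:

```python
import statistics

def flag_outliers(values, k=3.0):
    """Flag records whose value lies more than k median absolute
    deviations from the median -- a robust screen for, e.g.,
    implausibly short interview durations suggesting interviewer
    error. Returns a True/False flag per record."""
    med = statistics.median(values)
    mad = statistics.median(abs(v - med) for v in values)
    if mad == 0:  # all values (nearly) identical: nothing to flag
        return [False] * len(values)
    return [abs(v - med) / mad > k for v in values]

# Hypothetical interview durations in minutes; the 2-minute interview
# is flagged for manual review.
flags = flag_outliers([12, 11, 13, 12, 2])
```

Flagged cases would then be cross-checked against interviewer feedback forms and comments before being removed, rather than dropped automatically.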


University of Connecticut Health Center Study


This was the 10th round of a trend study of Connecticut residents' perceptions of the Health Center. The questionnaire explored: awareness and familiarity; the Health Center's image; use of the Health Center (e.g., cardiac, cancer, orthopedic, dentistry) and its ratings against other hospitals (e.g., Yale New Haven, Hartford, St. Francis); and advertising and publicity. As defined by the Health Center, three markets (primary, secondary, statewide) stratified the sample. The findings were compared and contrasted across the three markets.


Dental Implant Study


This was a study of UConn Health Center's primary and secondary market areas to identify the potential market share for dental implants and to evaluate the experiences of those who have dental implants. It was a pre-emptive study laying the groundwork for establishing a dental implant center. The sample consisted of patients who had dental implants. The questionnaire explored: the referral process; satisfaction with a previous implant; familiarity with the Health Center and its dental practice; and the demographics of dental implant patients. Following this study, UConn Health Center established the Center for Implant and Reconstructive Dentistry.


University of Connecticut Perceptions


This was a trend study assessing perceptions of the University of Connecticut (an image study) among UConn alumni and the Connecticut general public. The survey had been conducted annually since 1996. The study included three groups: Connecticut alumni, national alumni, and the Connecticut general public. The survey sampling method for the alumni changed in 2007. To help the UConn Alumni Relations Office collect data on particular alumni issues, the national alumni sample was increased in 2007 and included an over-sample of alumni association members. In 2008, the national alumni sample size was reduced, though it still included an over-sample of alumni association members. The final national alumni sample was weighted to reflect true population parameters by member status and geographic region. The Connecticut alumni sample was weighted by member status, and the Connecticut general public sample was weighted by age, gender, and education following American Community Survey (ACS) population distributions.


Baylor University Perceptions

The Baylor Perceptions mirrored the methodology we employed for the UConn Perceptions survey above.


State of Connecticut and Minnesota Gubernatorial Elections


CSRA conducted election polling for the states of Connecticut and Minnesota. I was involved in the complex post-stratification weighting of the samples. To mirror the characteristics of the voting-age populations of CT and MN, the survey samples were weighted by age, gender, race, and level of education. As a matter of policy, CSRA did not weight samples by party affiliation.


Strategic Initiatives Survey of Faculty, Staff and Students


Continuous Strategic Initiatives Survey


Strategic Initiatives Local Planning Survey


Enrollment Research Feedback Survey


Enrollment Survey