BFA_2007_MCC-TB_v01_M
Threshold BRIGHT I 2007-2008
Name | Country code |
---|---|
Burkina Faso | BFA |
Impact Evaluation Survey
Sample survey data [ssd]
Individuals, Households, Schools
Anonymized dataset for public distribution
Household questionnaire:
School questionnaire:
Topic | Vocabulary |
---|---|
Education | MCC Sector |
Name | Affiliation |
---|---|
Mathematica Policy Research | Millennium Challenge Corporation |
Name |
---|
Millennium Challenge Corporation |
The sample frame comprised all households within the 293 villages that applied to the program, including all of the villages in the study's participant and comparison groups. Data collectors, however, were unable to locate two villages, probably because the spelling of the village names on the application did not match the village names found by the data collectors, perhaps due to dialect differences or misspellings. As a result, the survey included 291 villages, of which 132 were participant villages and 159 were comparison villages. [Note: The analysis file excludes four additional villages. Two were excluded because they were the only villages that applied for the program from their department and thus were not eligible for the analysis used. The other two villages were excluded because no data were reported for them. Therefore, the dataset includes data on 287 villages.]
HOUSEHOLD SAMPLING
In each village located, interviewers conducted a census of households to develop a village-level sampling frame. Households in the study are defined as groups of people living together (in a common physical space), working together under the authority of a person called the head of household, and taking their meals together or from the same supply of food. The members of a household must have lived together in this fashion during at least 9 of the previous 12 months. During the census, interviewers identified households with school-age girls (5 to 12 years old) and collected information about the household's access to beasts of burden.
Following the census, the households with school-age girls became the sampling frame, and 30 of these households were randomly selected to be surveyed in each village. The sampling frame at the village level was stratified by access to beasts of burden, a proxy for wealth. Three strata were identified: households that owned at least one beast of burden, households that did not own a beast of burden but had access to one, and households that neither owned nor had access to a beast of burden. Under the hypothesis that means of production are positively correlated with income, the University of Ouagadougou suggested this stratification to ensure a representative household sample. From each stratum, interviewers selected 10 households to be surveyed: they wrote the name of each head of an eligible household on a piece of paper, placed the pieces of paper in a hat, and drew 10 names. The process was conducted publicly in each village.
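To make the stratified draw concrete, the following is a minimal sketch of the village-level selection in code form. The household records, stratum labels, and field names are illustrative only; the actual draw was done by hand, with names drawn from a hat.

```python
import random

# Hypothetical census records for one village. Each record holds the head of
# household's name and the beast-of-burden stratum the household falls into:
# "owns", "access_only", or "none". Field names are illustrative only.
census = [
    {"head": "Head 01", "stratum": "owns"},
    {"head": "Head 02", "stratum": "access_only"},
    {"head": "Head 03", "stratum": "none"},
    # ... remaining eligible households (those with school-age girls)
]

def select_households(census, per_stratum=10, seed=None):
    """Randomly draw `per_stratum` households from each stratum, mirroring
    the public names-in-a-hat procedure described above."""
    rng = random.Random(seed)
    selected = []
    for stratum in ("owns", "access_only", "none"):
        eligible = [h for h in census if h["stratum"] == stratum]
        n = min(per_stratum, len(eligible))  # guard against small strata
        selected.extend(rng.sample(eligible, n))
    return selected

village_sample = select_households(census, per_stratum=10, seed=1)
```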
SCHOOL SAMPLING
School data were collected using different sampling techniques for Wave 1 and Wave 2. For Wave 1, interviewers asked village elders to name all the schools, if any, that children from that village attended regularly. Interviewers then selected up to three of the named schools closest to the village center, within 10 kilometers, as the schools to be surveyed for that village. [Note: This strategy could have introduced sampling bias if villages had children attending more than three schools and different types of schools were systematically located closer to village centers; however, in 98.7% of villages with any children attending school, only one or two schools were named.] Data were collected from more than 300 schools.
For Wave 2, interviewers used the household data as the starting point. From the 8,491 completed household surveys, children were identified as currently attending 367 schools. Of those, 316 were attended by three or more children in the sample. Of those, schools that were within 10 kilometers of the children's village were targeted for interview. Data from more than 280 schools were obtained, and matched with 7,316 of the children in the sample.
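The Wave 2 targeting rule can be summarized in a short sketch, assuming hypothetical household-survey records with a school identifier and a village-to-school distance; the field names and values are illustrative.

```python
from collections import Counter

# Illustrative records from the household survey: each row gives the school a
# sampled child attends and the distance (km) from the child's village to it.
children = [
    {"child_id": 1, "school_id": "S01", "distance_km": 3.0},
    {"child_id": 2, "school_id": "S01", "distance_km": 3.0},
    {"child_id": 3, "school_id": "S01", "distance_km": 3.0},
    {"child_id": 4, "school_id": "S02", "distance_km": 12.5},
    # ...
]

# Count how many sampled children attend each school.
attendance_counts = Counter(c["school_id"] for c in children)

# Target schools attended by at least three sampled children and located
# within 10 km of the children's village.
targeted = {
    c["school_id"]
    for c in children
    if attendance_counts[c["school_id"]] >= 3 and c["distance_km"] <= 10
}
```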
Wave 1 school data were also matched with children in the household sample. Data from 270 schools were matched with more than 7,400 children in the sample. Only data from the matched schools are found in the dataset.
As described above, we were unable to survey four of the 293 applicant villages in our household survey. In addition, two villages were the only applicant villages in their department, making it impossible to create the relative score variable needed for the regression discontinuity (RD) design. As a result, we dropped these six villages from consideration in our analysis and focused on the 287 villages for which we had meaningful applicant and household survey data.
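As an illustration of why single-applicant departments had to be dropped, the sketch below centers each village's application score on a department-level reference value. The exact construction of the relative score used in the analysis may differ, and the department names and scores shown are hypothetical.

```python
from statistics import median

# Hypothetical application scores keyed by (department, village). Here the
# relative score is illustrated as the village score centered on its
# department's median score; the evaluation's actual cutoff rule may differ.
scores = {
    ("Dept1", "VillageA"): 14,
    ("Dept1", "VillageB"): 9,
    ("Dept2", "VillageC"): 11,  # a department with a single applicant village
}

def relative_scores(scores):
    by_dept = {}
    for (dept, _), s in scores.items():
        by_dept.setdefault(dept, []).append(s)
    rel = {}
    for (dept, village), s in scores.items():
        if len(by_dept[dept]) < 2:
            # A lone applicant village has no within-department comparison,
            # so no relative score can be formed (such villages were dropped).
            rel[(dept, village)] = None
        else:
            rel[(dept, village)] = s - median(by_dept[dept])
    return rel
```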
The response rate for the household survey was 97.3 percent. This was calculated by dividing the total number of households that responded (8,491) by the number of households sampled in the located villages (8,730). The two unlocated villages were not included in this calculation.
The response rate for the school survey was 99.2 percent. This was calculated by dividing the total number of schools that responded (367) by the total number of schools identified in the household survey as having children enrolled (370).
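The two response-rate calculations reduce to simple division, shown below for reference.

```python
# Response-rate arithmetic from the figures reported above.
household_response_rate = 8491 / 8730  # households responding / households sampled
school_response_rate = 367 / 370       # schools responding / schools identified

print(f"Household survey: {household_response_rate:.1%}")  # 97.3%
print(f"School survey:    {school_response_rate:.1%}")     # 99.2%
```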
Eligibility weights applied to application forms submitted by each of the 293 villages:
QUESTION SCORING
Question | Scoring |
---|---|
N°1 | 1 point per girl |
N°2 | 1 point per girl |
N°3 | 1 point per girl |
N°4 | +1 point if between 0 and 5 km; -1 point if 6 km or more |
N°5 | 1 point per student |
N°6 | +1 if there are no rooms; -1 if there are |
N°7 | +1 for each village between 0 and 5 km; -1 for each village 6 km or more |
N°8 | -1 for each existing school; +1 if there are none |
N°9 | +1 if between 0 and 5 km; -1 if 6 km or more |
N°10 | 1 point per girl |
N°11 | +1 if between 0 and 20 km; -1 if 21 km or more |
N°12 | 1 point per student |
N°13 | Not included in scoring |
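A minimal sketch of how these rules translate one application form into a single eligibility score follows. The dictionary keys (q1_girls, q4_distance_km, and so on) are hypothetical stand-ins for the form's fields, not the actual variable names.

```python
def score_application(app):
    """Compute an eligibility score from one village's application,
    following the per-question rules listed above."""
    score = 0
    # Questions 1-3 and 10: one point per girl reported.
    score += app["q1_girls"] + app["q2_girls"] + app["q3_girls"] + app["q10_girls"]
    # Questions 5 and 12: one point per student reported.
    score += app["q5_students"] + app["q12_students"]
    # Questions 4 and 9: +1 if within 0-5 km, -1 if 6 km or more.
    score += 1 if app["q4_distance_km"] <= 5 else -1
    score += 1 if app["q9_distance_km"] <= 5 else -1
    # Question 6: +1 if there are no rooms, -1 if there are.
    score += 1 if app["q6_rooms"] == 0 else -1
    # Question 7: +1 per village within 0-5 km, -1 per village 6 km or more.
    score += sum(1 if d <= 5 else -1 for d in app["q7_village_distances_km"])
    # Question 8: -1 per existing school, +1 if there are none.
    score += -app["q8_schools"] if app["q8_schools"] > 0 else 1
    # Question 11: +1 if within 0-20 km, -1 if 21 km or more.
    score += 1 if app["q11_distance_km"] <= 20 else -1
    # Question 13 is not included in the scoring.
    return score

example_application = {
    "q1_girls": 12, "q2_girls": 8, "q3_girls": 5, "q10_girls": 7,
    "q5_students": 20, "q12_students": 15,
    "q4_distance_km": 4, "q9_distance_km": 7, "q11_distance_km": 18,
    "q6_rooms": 0, "q7_village_distances_km": [3, 8], "q8_schools": 1,
}
print(score_application(example_application))  # 68 under these rules
```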
Mathematica developed two questionnaires: a household questionnaire and a school questionnaire. The household questionnaire asked about household demographics, children's educational outcomes (enrollment and attendance), and parents' perceptions of education. The school survey asked about schools' characteristics and children's attendance and enrollment.
The household questionnaire drew heavily from several existing questionnaires used widely in developing countries, including the Demographic and Health Survey (USAID), the Multiple Indicator Cluster Survey (UNICEF), and the Living Standards Measurement Study (World Bank). Reliance on these questionnaires provided two important benefits. First, given their wide and successful use in developing countries, including Burkina Faso, they enhanced our confidence in the validity and reliability of the questions in the household questionnaire. Second, reliance on the existing questionnaires allows researchers to compare our results with results from the earlier surveys in both Burkina Faso and other countries. Where necessary, we adapted or added survey questions to yield detailed information to answer the research questions. The household survey included the following modules:
Household characteristics. This module asked for information about the head of household, such as religion, ethnicity, and education; information about the household itself, including GPS coordinates, construction materials, and water source; and intervention-specific information, such as whether any children were attending preschool (Bisongo) or whether any women were participating in literacy training.
Household listing form. This module asked the respondent to provide a complete list of all children between 5 and 12 years old residing in the household. Basic information collected on these children included relationship to the head of household, sex, age, and whether the child had attended school at any time during the 2007-2008 school year.
Education. This module was administered for all children 5 to 12 years old who attended school at any time during the 2007-2008 school year. Questions addressed access to textbooks; information about the school attended, including specific interventions such as separate latrines, participation in feeding programs, and attendance; and reasons the parents sent the child to school.
Child labor. This module was administered for all children 5 to 12 years old and asked whether the children were engaged in work for persons outside the household (for pay or in-kind) and whether they performed various chores.
Mathematics assessment. This module was administered to all children 5 to 12 years old. Children were shown pre-printed cards and asked to identify numbers, count items, indicate which number was the greater of a pair of numbers, and perform simple addition and subtraction.
French assessment. This module was administered to all children 5 to 12 years old. Children were shown pre-printed cards and asked to identify letters, read one- and two-syllable words, and identify the correct noun and verb from a list to fill in a blank in a simple sentence. Examples came from grade 1 and 2 Burkina Faso primary education reading texts.
The school questionnaire was based largely on the World Bank's Living Standards Measurement Study School Questionnaire, modified to address Burkina Faso's educational context and answer the evaluation's research questions. The school survey was administered in two waves. The first wave collected information on school characteristics. The second wave, conducted about five months later along with the household survey, collected attendance and enrollment data for children interviewed in the household survey. Accordingly, Mathematica created two school questionnaire forms. The first included detailed characteristics about the school and a roster to collect overall attendance data. The second included only an attendance roster for students enrolled in the study. Together, the school surveys included the following modules:
Both the household and school questionnaires were first written in English and then translated into French. Mathematica and the University of Ouagadougou collaborated on the translations, ensuring that idiomatic expressions and language usage particular to Burkina Faso were appropriately incorporated. However, French is rarely spoken in rural villages. There are currently 68 languages spoken in Burkina Faso, several of which are unwritten or inconsistently written (Gordon 2005). Faced with the prospect of surveying people in many languages, Mathematica decided that the best approach was to hire interviewers fluent in both French and the local languages and train them to translate the instrument as they conducted the interview. Table C.1 presents the native language of respondents to the household survey.
Table C.1. Household Questionnaire Respondent Native Language

Native Language | Frequency | Percent |
---|---|---|
French | 178 | 2.1 |
Mooré | 3,145 | 37.1 |
Dioula | 33 | 0.4 |
Fulfudé | 1,782 | 21.1 |
Gulmachéma | 2,345 | 27.7 |
Bwamu | 140 | 1.7 |
Other Language | 844 | 10.0 |
Total | 8,467 | 100.1 |
Once the questionnaires were developed, they were tested in a pilot data collection for which we randomly selected 10 villages (5 treatment and 5 comparison) to be surveyed in May and June 2007. Our aim was to survey households and schools in these villages in order to identify potential problems. The pilot called for interviewer training; conduct of a census and random selection in each village; the identification of schools; conduct of the household and school surveys; and data entry, cleaning, and delivery. A Mathematica team traveled with interviewers and observed them in several villages, talked with village residents, and held a debriefing session with interviewers.
The pilot test identified two key problems. First, the household interview was much too long, averaging more than 90 minutes. To reduce respondent burden, we decreased the number of questions to limit the interview to less than one hour. Second, we determined that several questions were difficult for respondents to answer, particularly those about distances, time, and space. For example, respondents struggled to answer questions about the distance from the household to the school or the number of hectares farmed. For questions that we thought important for the analysis, we asked the interviewer for an estimate or sought other measures. [Note: Because both the household and school surveys were substantially modified following the pilot data collection, we did not use the pilot data for analysis. During subsequent data collection, however, all 10 villages included in the pilot data collection were revisited and included in the household and school surveys.]
For the school survey, we concluded that it was nearly impossible during analysis to link the students on the school roster with the children reported by the household survey as enrolled in school. The reason was the lack of a unique identifier, such as a government-issued identification number, and the fact that many children shared both the same first and last name. The matching procedure was important because key measures for the evaluation were school enrollment and attendance. Accordingly, we grew concerned that using the household survey alone to measure school enrollment and attendance might lead to misleading results due to social desirability or other biases. As a result, we developed a procedure whereby matching took place while interviewers were in each village. For this procedure, interviewers first completed the household surveys and then populated the school attendance roster with the names of all children identified in the household surveys as enrolled in a local school. They included the child's household ID and household listing number on the roster. We later used these identifiers to link school data to household data. Once in the school, interviewers used the roster to collect attendance and enrollment information only for children on the roster.
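The resulting linkage amounts to a join on the household ID and household listing number carried onto the roster. Below is a minimal sketch of that join; the identifiers and field names are hypothetical.

```python
# Household-survey children keyed by (household ID, household listing number).
household_children = {
    ("HH001", 2): {"name": "Child A", "village": "V01"},
    ("HH003", 1): {"name": "Child B", "village": "V01"},
}

# Roster rows collected at the school, carrying the same two identifiers.
school_roster = [
    {"household_id": "HH001", "listing_no": 2, "school_id": "S01", "present": True},
    {"household_id": "HH003", "listing_no": 1, "school_id": "S01", "present": False},
]

# Attach school attendance to each matched household child.
matched = []
for row in school_roster:
    key = (row["household_id"], row["listing_no"])
    child = household_children.get(key)
    if child is not None:
        matched.append({**child, "school_id": row["school_id"], "present": row["present"]})
```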
Start | End | Cycle |
---|---|---|
2007-02 | 2007-04 | Pilot household survey in 10 villages |
2007-10 | 2008-04 | School surveys in 293 communities |
2008-01 | 2008-04 | Household survey in 293 villages |
Name | Affiliation |
---|---|
Jean Pierre Sawadogo, Robert Ouedraogo, and Pam Zahonogo | University of Ouagadougou |
To carry out the data collection activities, Mathematica drafted and released an RFP to solicit proposals from local data collection firms. We received seven proposals; Mathematica interviewed representatives of three firms and ultimately selected a team of researchers from the University of Ouagadougou, led by Jean Pierre Sawadogo, Robert Ouedraogo, and Pam Zahonogo. The data collection firm was responsible for the following:
Before the start of each data collection, the university team conducted interviewer training that covered identifying schools, conducting a village census and selecting eligible households at random, basic interviewing procedures, and a review of each question to ensure that interviewers understood its intent. Interviewers then were organized by linguistic group and worked together to determine how best to translate questions into the local languages. Mathematica participated in the interviewer training.
The data collection consisted of the first school survey conducted in fall 2007 and the follow-up school survey conducted in spring 2008. The follow-up school survey was coupled with the household survey. The pilot test, described above, was conducted in late spring 2007. All versions of the interview were conducted with paper questionnaires.
The school survey was conducted with the school director, when possible. The interviewer was also asked to gather attendance information, particularly for the day of the visit. For that module, the interviewer called the roll and personally noted absences. In all, 360 schools were surveyed.
The household survey was conducted with the head of household or another member of the household knowledgeable about household children. The interviewee most often was the male head of household. Ninety-eight percent of the interviews were conducted with men and 80 percent with the head of the household.
The university team hired 56 interviewers to collect household and school data. For the full household data collection, the interviewers were organized into 18 teams by linguistic group. Each team consisted of three to four interviewers and was led by an experienced field supervisor. The teams were then assigned to a cluster of villages. The teams simultaneously surveyed the selected villages throughout the country.
Following data collection, the data were entered and edited by the University of Ouagadougou team using SPSS statistical analysis software. Preliminary data sets were provided to MPR for extensive data checking. The MPR team reviewed the data for completeness and internal consistency and checked whether the match between household and school data had been done correctly. In particular, because of its importance to the central research question, we focused on reconciling data for children identified as being enrolled in school during the household interview but not found on the school attendance roster, and children found on the school attendance roster but not on any household survey. These errors occurred for a variety of reasons, including interviewers not following the procedure and illegible writing.
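The reconciliation step can be illustrated as two set differences over the (household ID, listing number) keys used for matching; the keys shown are hypothetical.

```python
# Children reported as enrolled in the household survey vs. children found on
# school rosters, each identified by (household ID, household listing number).
enrolled_in_household_data = {("HH001", 2), ("HH003", 1), ("HH007", 4)}
on_school_rosters = {("HH001", 2), ("HH003", 1), ("HH009", 3)}

# Enrolled per household survey but missing from rosters: follow up with schools.
missing_from_rosters = enrolled_in_household_data - on_school_rosters

# On a roster but not in any household survey: follow up with interviewers.
missing_from_households = on_school_rosters - enrolled_in_household_data
```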
Name | Affiliation | Email |
---|---|---|
Monitoring & Evaluation Division | Millennium Challenge Corporation | impact-eval@mcc.gov |
Levy, Dan, Matt Sloan, Leigh Linden, and Harounan Kazianga. 2009. Impact Evaluation of Burkina Faso's BRIGHT Program. Washington, DC: Mathematica Policy Research, Inc.
The user of the data acknowledges that the original collector of the data, the authorized distributor of the data, and the relevant funding agency bear no responsibility for use of the data or for interpretations or inferences based upon such uses.
Name | Affiliation | Email |
---|---|---|
Monitoring & Evaluation Division | Millennium Challenge Corporation | impact-eval@mcc.gov |
DDI_BFA_2007_MCC-TB_v01_M
Name | Role |
---|---|
Millennium Challenge Corporation | Metadata Producer |
2011-03-15
Version 1.1 (March 2011).
Version 2.0 (May 2015). Edited version based on Version 1.1 (DDI-MCC-BFA-MPR-BRIGHT-2009-v1.1), which was produced by the Millennium Challenge Corporation.