NIC_2015-2016_PROSASRIE-BL_v02_M
Sustainable Water Supply and Sanitation Project Impact Evaluation 2015-2016
Baseline Survey
Evaluación de Impacto del Proyecto de Sostenibilidad del Sector de Agua y Saneamiento Rural (PROSASR) - Línea Base 2015-2016
Name | Country code |
---|---|
Nicaragua | NIC |
Impact Evaluation Survey
Sample survey data [ssd]
This survey targets 5 units of analysis, 3 of which are heavily interlinked and for which information may be provided by the same respondent:
A survey to the Providers of Technical Assistance or PAT at the municipal level. These individuals are known as UMAS (Unidad Municipal de Agua y Saneamiento) or UTASH (Unidad Territorial de Agua, Saneamiento e Higiene) and are responsible for providing technical assistance to the Community Water and Sanitation Committees known as CAPS (Comités de Agua Potable y Saneamiento).
A set of three surveys is directed at the service providers themselves at community level: the Comités de Agua Potable y Saneamiento or CAPS. These surveys consist of:
a) A community survey - can be responded to by the community leader or by the President of the CAPS
b) A system survey - usually responded to by the plumber or technical system operator, if there is one, or the President of the CAPS
c) A service provider survey - usually responded to by both the President of the CAPS and the accountant or treasurer
Version 1.3: This version of the data is identical to version 1.2, but the datasets have been anonymized: data for variables that contain identifying information, such as latitude/longitude or GPS coordinates, have been set to missing.
The scope of the PAT survey includes:
a) basic information on the PAT - what kind of organization it is, how many communities it has assigned to it, how many have solicited and received support in the past years
b) the financial, human and logistical resources available to the PAT, including whether it has budget and vehicles to travel to the communities
c) their perspectives on the trainings they receive and would like to receive
d) their assessment of the needs of the CAPS, from the perspective of the UMAS
The scope of the 3 CAPS surveys includes:
a) The community survey: how many households there are in the community, the main language and ethnicities, WASH facilities at schools and health centers, sanitation facility coverage in the community, household hygiene, and rubbish collection.
b) The system survey: this contains questions on the state of the water system infrastructure - identifying its different components such as the source, tank, distribution network and whether there is water treatment in the system. It also assesses water quality and accessibility as well as the type of system and the number of households it serves.
c) The service provider survey: this contains questions on the CAPS or committee's legal status, whether it is formalized, how representative it is, the tariffs they charge, whether they have micro-meters (household water meters) installed, a variety of economic information, their operation and maintenance activities, environmental protection, and the sources of financing for the initial construction of the system.
Rural areas only
The starting point was a universe of 6,862 communities across 148 municipalities that were in the SIASAR database and constituted almost the entire country. Community eligibility criteria were introduced in order to focus the evaluation on communities that would most likely improve their indicators with the intervention, given their initial status. These criteria did not introduce any bias or non-random selection. The eligibility criteria were as follows (an illustrative sketch of the screening logic is given after this list):
• The communities had to have a total number of between 20 and 1,000 households. A community was excluded if it was classified in the SIASAR database as having fewer than 20 households (for logistical purposes). Similarly, communities classified as having more than 1,000 households were excluded, as they likely did not meet the Government of Nicaragua's definition of a rural community (population < 5,000 people, assuming 5 people per household).
• To be eligible, a community had to have at least one existing WSS system. We were able to gain this information from SIASAR which was used as a basis for the sample selection.
• Communities that shared a system between them were excluded so as to avoid the potential for contamination between treatment and control. This took us to 2,489 communities.
• Those communities with systems that were included had to have a SIASAR infrastructure rating for the system of greater than or equal to 0.4 on the SIASAR infrastructure index (EIA). An infrastructure score lower than 0.4 would correspond to a system in severe disrepair, unlikely to experience any impacts from the planned intervention being evaluated, which focuses on capacity building rather than infrastructure enhancements.
• Although communities without system infrastructure were excluded for the same reason, those without a service provider were included in the sample, as the technical assistance from the UMAS specifically promotes the creation of CAPS where they have not yet been constituted, as part of building up and supporting the sustainability of community-level WSS services.
• A community was excluded if it had an overall IAS score of >0.75, as those with high overall sustainability scores on the IAS index were already operating and providing services in a sustainable fashion and were unlikely to experience detectable impacts from the intervention.
• A municipality was considered eligible if it had greater than 4 eligible communities, to allow for balanced stratified randomization of communities (within each "poor" and "less-poor" municipality stratum, respectively).
• Finally, the survey company was permitted to place some logistical restrictions on travel to communities that would not fit within the baseline timeframe, and the Government excluded the indigenous area of Alto Wangki and Bocay, as it was going to implement the intervention there regardless and did not want control communities.
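The screening can be summarized as a simple filter over the SIASAR community table. The sketch below is illustrative only: the column names are hypothetical rather than actual SIASAR field names, and the logistical and government-requested exclusions described in the last bullet are not shown.

```python
# Illustrative sketch of the community eligibility screen; column names
# (n_households, has_system, shares_system, eia_score, ias_score, municipality)
# are hypothetical, not actual SIASAR field names.
import pandas as pd

def screen_communities(siasar: pd.DataFrame) -> pd.DataFrame:
    """Apply the eligibility criteria described above to a SIASAR-style table."""
    eligible = siasar[
        siasar["n_households"].between(20, 1000)   # 20-1,000 households
        & siasar["has_system"]                     # at least one existing WSS system
        & ~siasar["shares_system"]                 # exclude systems shared across communities
        & (siasar["eia_score"] >= 0.4)             # infrastructure index (EIA) >= 0.4
        & (siasar["ias_score"] <= 0.75)            # overall IAS score not above 0.75
    ]
    # Keep municipalities with more than 4 eligible communities, so that a
    # balanced 2-treatment / 2-control draw is possible within each one.
    counts = eligible.groupby("municipality")["municipality"].transform("size")
    return eligible[counts > 4]
```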
Name | Affiliation |
---|---|
Joshua Gruber | Center for Effective Global Action (CEGA), University of California, Berkeley |
Name | Affiliation | Role |
---|---|---|
Christian Borja Vega | The World Bank Group | Technical assistance in questionnaire design, water quality testing protocol |
Lilian Pena Weiss | The World Bank Group | Supervision of Field Coordinator and contractual arrangements with the survey firm, inputs to questionnaire design based on knowledge of intervention |
Clementine Marie Stip | The World Bank Group | Assistance in questionnaire design, assistance in coordination between the WB and Survey Firm, assistance and supervision of Field Coordinator's activities |
Sophie Ayling | The World Bank Group | Assistance in questionnaire design, sampling methodology, data collection supervision, data processing and analysis |
Name | Role |
---|---|
Strategic Impact Evaluation Fund | Main Funder of the Impact Evaluation |
Fondo de Inversión Social de Emergencia | PROSASR implementing agency, in-kind contributions through mobilization and facilitation of government contacts. |
Spanish Fund for Latin America and the Caribbean | Contributed through the SIASAR initiative to data quality monitoring. |
Sample size: The size of the eventual sample was 300 communities - 150 assigned to treatment and 150 assigned to control.
Selection process: The selection process of these 300 communities has largely been described in the section "Universe" above. Here the stratification is described in more detail.
Stratification and stages of sample selection: The sample of 300 communities was selected across 75 municipalities; within each municipality, 4 communities were randomized to either treatment or control (2 allocated to treatment and 2 to control). This resulted in 150 treatment communities and 150 control communities evenly distributed across the 75 municipalities. In each of the treatment and control groups, 100 communities come from the 50 municipalities classified as "less-poor" and 50 communities from the 25 municipalities classified as "poor".
Random selection and random allocation were conducted in Stata 13. Municipalities were stratified (poor, less poor) and randomly ordered; the first 50 less-poor and the first 25 poor municipalities in each list were selected for inclusion in the evaluation. Within each selected municipality, all eligible communities were randomly ordered. The first two communities became control communities and were automatically enrolled in the evaluation. The remaining randomly ordered communities (positions 3 through n) were all assigned to receive the treatment from the municipality; two communities from this treatment list were then selected for enrollment in the evaluation (data collection) in a manner consistent with the random ordering described.
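The randomization itself was carried out in Stata 13 and the do-files are archived with the team; the following Python sketch, using assumed data structures and an assumed seed, simply illustrates the two-stage logic described above.

```python
# Illustrative sketch of the two-stage randomization; the actual assignment was
# produced with Stata 13 do-files. Names, data structures and the seed are assumed.
import random

def randomize(municipalities, seed=2015):
    """municipalities: dict mapping name -> ("poor" or "less_poor", [eligible community names])."""
    rng = random.Random(seed)

    # Stage 1: randomly order municipalities within each poverty stratum and keep
    # the first 50 "less poor" and the first 25 "poor".
    strata = {"less_poor": [], "poor": []}
    for name, (stratum, _) in municipalities.items():
        strata[stratum].append(name)
    for names in strata.values():
        rng.shuffle(names)
    selected = strata["less_poor"][:50] + strata["poor"][:25]

    # Stage 2: within each selected municipality, randomly order the eligible
    # communities; the first two are control, the rest are assigned to treatment,
    # and the first two treatment communities are enrolled for data collection.
    assignment = {}
    for name in selected:
        communities = list(municipalities[name][1])
        rng.shuffle(communities)
        control, treatment = communities[:2], communities[2:]
        assignment[name] = {
            "control": control,
            "treatment": treatment,
            "enrolled": control + treatment[:2],
        }
    return assignment
```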
Levels of representation: As the original universe comprised all the communities in the country registered in SIASAR (nearly every community nationally), and we then randomly selected half the municipalities in the country, our sample has a good level of representation. We also checked for balance in the distribution of communities across the treatment and control groups in the Pacific Coast, Central Region and Atlantic or Caribbean Coast.
Strategy for absent respondents/not found/refusals: Here we had to employ two different strategies. In the case of the CAPS and municipal surveys, the surveyors coordinated appointments prior to arrival in the districts and communities through FISE (the Government agency), which minimized the risk of refusals or absent respondents. Where these did occur, surveyors would make an appointment to return on another day and meanwhile progress with a different nearby community in the sample.
CAPS refusals were very uncommon, but in four cases where the surveyors were advised not to enter due to conflict in the community, we had a replacement strategy in place. From the random sample, which had been ranked by the Principal Investigator, the next community in the list was chosen, all parties were informed, and the community visit was reprogrammed. In total 8 communities were replaced in the sample using this strategy. The others were replaced for logistical reasons or because no system was found in the community, even though this had been one of the eligibility criteria identified in SIASAR. Another case identified was where there was no legalized, formal CAPS in the community but there was some kind of group that administered the water system. This case was built into the structure of the CAPS survey and those communities were kept in the sample.
For households, a household was skipped if the only person available to respond to the survey was under the age of 18 or if the person refused the interview; in either case the surveyor would pass on to the next-door house and then continue to follow the croquis (sketch map) that had been drawn up by the team leader in advance.
Listing exercises were not possible in this context, as the Survey Firm was not able to obtain a list of household names prior to the fieldwork. However, a mapping strategy was drawn up, which involved the pre-identification of households to be surveyed from a satellite image. Each household was assigned a number in the image or hand-drawn map. The total number of households was divided by the number of households to be interviewed to determine the interval "n" separating households in the sample. A number between 1 and "n" was requested from someone; this household number was used as the starting point of the route, and the interval "n" was counted from the latest house accessed to determine the next house in the sample. This procedure was repeated to complete the total required interviews. For smaller communities, where the required sample was larger relative to the number of households, the intervals between sampled households corresponded to the total number required for collection. Please see table 3 of the baseline report for the number of households sampled per community.
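As a rough illustration of the interval selection described above, the following sketch (with hypothetical function and variable names) mirrors the "divide, pick a random start between 1 and n, then step by n" procedure.

```python
# Illustrative sketch of the interval ("n") selection of households from the
# numbered community map; function and variable names are hypothetical.
import random

def select_households(total_households, sample_size, rng=random.Random()):
    """Return the map numbers of the households to be interviewed."""
    interval = max(1, total_households // sample_size)  # interval "n"
    start = rng.randint(1, interval)                    # random start between 1 and "n"
    selected = []
    house = start
    while len(selected) < sample_size and house <= total_households:
        selected.append(house)
        house += interval                               # count "n" from the last house reached
    return selected

# Example: 120 mapped households and 15 interviews give an interval of 8.
print(select_households(120, 15))
```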
As mentioned above, we had to replace 8 communities from the original randomization sample due to conflict or because they deviated from the eligibility criteria used to select the communities in the first place, which were based on SIASAR's data. This occurred when the community did not have a community system and/or was too small to conduct the required number of surveys. Here is the full list of communities that were replaced and their replacements:
Originals: (Municipality COMMUNITY)
Bluefields SAN MIGUEL
Bluefields LOS PEÑONES
Granada TEPALON
Santo Tomas LA PITA
Waspan KURURIA
Waspan TRONQUERA
Waspan BUULKIAMP
Waspan UHRY
Replacements: (MUNICIPALITY, COMMUNITY)
MACUELIZO, EL JICARITO
MACUELIZO, OCOTE SECO
GRANADA, HORMIGON
SANTO TOMAS, MOLLEJONES
ACHUAPA, EL CONSUELO
ACHUAPA, EL MATAPALO
ACHUAPA, EL SALITRE
ACHUAPA, SAN ANTONIO 2
Household surveys:
Total number of HH surveys requested in sample: 5000
Total number of HH surveys collected in the field: 4850
Total number of HH surveys rejected: 1
Percentage of HH that participated: 97%
PAT surveys:
Total number of PAT surveys required in sample: 75
Total number of PAT surveys collected in the field: 78
Percentage of PATs that participated: 104%
(Comment: more PAT surveys were carried out than in the original sample because some replacement communities were located in new municipalities, so PAT surveys were carried out in both the original and the replacement municipalities.)
CAPS - Community surveys:
Total number of COM surveys required in sample: 300
Total number of COM surveys collected: 299
Percentage of COMs completed: 99.7%
CAPS - Prestador surveys:
Total number of PES surveys required in sample: 300
Total number of PES surveys collected in field: 292
Percentage of PES completed: 97.3%
CAPS - Sistema surveys:
Total number of SIS surveys required in sample: 300 (or more, depending on whether there were multiple systems per community)
Total number of SIS surveys collected in field: 312
Percentage of SIS completed: 104% (more systems found in each community than originally anticipated)
Water samples: E. coli
Total number of samples expected: 1000 over 150 communities
Total number of samples taken: 805 over 164 communities
Percentage of samples expected that were taken: 81%
Communities where samples were taken, as a percentage of communities where samples were expected: 109%
Explanation: A complicated procedure resulted in confusion on the ground regarding which communities should be sampled.
Water samples: Chlorine
Total number of samples expected: 460 samples over 300 communities
Total number of samples taken: 76 samples over 30 communities
Percentage of samples expected that were taken: 17%
Communities where samples were taken, as a percentage of communities where samples were expected: 10%
Explanation: Samples were only taken where there had been chlorination in the system in the last 5 days. In only 10% of the communities had there been chlorination in the system in the previous five days; therefore only a limited number of samples were taken.
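For reference, the survey completion percentages reported in this section follow directly from dividing the number of surveys collected by the number required; a minimal calculation using the counts above:

```python
# Completion percentages = collected / required * 100, using the counts listed above.
required  = {"HH": 5000, "PAT": 75, "COM": 300, "PES": 300, "SIS": 300}
collected = {"HH": 4850, "PAT": 78, "COM": 299, "PES": 292, "SIS": 312}
for survey in required:
    print(f"{survey}: {100 * collected[survey] / required[survey]:.1f}%")
# HH: 97.0%  PAT: 104.0%  COM: 99.7%  PES: 97.3%  SIS: 104.0%
```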
We conducted a randomized controlled trial and are looking for impacts within our sample; therefore this is not applicable.
As previously mentioned, the IE baseline consisted of 5 surveys: the Technical Assistance Provider (PAT) survey, the Water and Sanitation Committee (CAPS) surveys at community level, of which there were 3 modules, and the Household (HH) survey. Water quality (WQ) tests for E. coli and chlorine were also carried out as part of the CAPS - System and HH surveys. These acronyms shall be used throughout this document. All questionnaires were in Spanish.
The PAT and CAPS questionnaires are all based most heavily on the SIASAR rural information-gathering system, described fully in the submitted baseline report. The team added questions relevant to the impact of PROSASR, building in linkages between the supply and demand sides of technical assistance and asking about trainings provided and received at each level. The PAT questionnaire was administered in each municipality, whilst the 3 CAPS modules were administered in each of the 299 communities visited (1 was lost from the sample). The household surveys were administered in a random selection of households within each community visited, identified according to the mapping strategies described in the section on "sampling procedures" above. Household surveys were administered with the head of the household or, where that was not possible, a member of the household over 18 years of age.
The household surveys were designed by the Principal Investigator based on unpublished surveys previously written with colleagues from the WSS sector.
A full description of the survey modules is provided in the section "Scope" earlier on in this form.
All surveys were written in Spanish, though some questions of the Household surveys were translated from English. Once the questionnaires were piloted, the Bank, in conjunction with the survey firm, implemented further changes to improve the phrasing so that it would be better understood in the local context, whilst keeping the meaning of the original questions.
All questionnaires received design inputs from all members of the World Bank team mentioned, in particular Lilian Pena Weiss (Senior Water and Sanitation Specialist, World Bank), Christian Borja Vega (Economist, World Bank) and Clementine Stip (Junior Professional Associate, World Bank) on the CAPS surveys, Joshua Gruber (Principal Investigator, University of California, Berkeley) for the household survey design, and Fermin Reygadas (Water Quality Specialist) for the design of the water quality testing protocols. Sophie Ayling (Field Coordinator, World Bank) and Florencia Rodriguez (Field Coordinator, ESA Consultores) were responsible for incorporating these changes, making contextual adjustments and ensuring they were implemented in the field according to the original objectives.
Start | End | Cycle |
---|---|---|
2015-11-16 | 2015-12-19 | Wave 1 |
2016-01-04 | 2016-01-30 | Wave 2 |
Name | Affiliation |
---|---|
Sophie Ayling | The World Bank Group (WB) |
Florencia Rodriguez | ESA Consultores (ESA) |
Enumerators were organised into seven teams of four. Each team included a team leader who was responsible for supervising the team, providing information on where water quality samples and household surveys were to be taken, and collecting the surveys and water quality samples in each community to pass on to the Local Field Coordinator in Nicaragua. The local field coordinator for ESA reported to ESA's field coordinator based in Honduras, who travelled regularly to the project area. Within each team, one surveyor was the technical surveyor, who carried out the CAPS, SIS and COM surveys as well as the water quality tests of the system and those collected in households. The other two members of the team were responsible for conducting the household surveys and collecting water samples and tests in households. The team leader was responsible for mapping the households where the HH surveys were to be collected; if s/he finished this early, s/he was also responsible for conducting the community survey to assist the technical surveyor, who had a larger workload.
With regards to field visits by the staff mentioned under "Data Collections", four one-to-two-week visits were carried out during the three months of piloting, training and data collection between November 2015 and January 2016. Additionally, a local World Bank consultant was employed in January to oversee the re-collection of several water samples after monitoring activities made clear that collection targets were not being met.
WAS A TRAINING OF ENUMERATORS HELD? WAS A PILOT SURVEY CONDUCTED?
Prior to the baseline launch, a pilot was carried out from 19-26 October, consisting first of a more basic training and second of field testing of the surveys. During the training and piloting, significant scope was found for adapting the surveys and the instructions provided. Inputs were also received from the government partners on how they wished the surveys to be adapted or the language to be improved. Excessive length of the CAPS surveys was also identified, as they took up to 3 hours for that one component, when it was originally anticipated that this element would take 1-2 hours. We were then able to identify the highest-priority questions and cut the surveys down in consultation with different experts in the team.
Following the pilot, a re-training of enumerators was held prior to the start of formal data collection for 5 days from the 9th-15th November and 2 days from 16th-17th November. The presentations from this training are enclosed with this data submission form.
During the main training of the first week, we carried out 3 days of training in "classrooms" and 2 days of field practice in 2 communities in Managua (Las Lajas and El Trapiche), both approved by the government partner. The training included an introduction to the project context, each of the surveys individually and roleplays demonstrated by different members of the World Bank and Survey Firm team, followed by practice in pairs with different scenarios. The surveyors also received presentations from the FISE government staff to provide the context for each of the surveys being carried out. In order to reach the pilot communities, the surveyor team arrived with ARAS coordination and assistance. The following week, there was an additional training in carrying out water quality tests on 16 and 17 November, provided by a water quality expert from the ESA team who had received inputs and guidance on E. coli and chlorine testing from an expert consultant from the World Bank. All of the trainings were carried out in Spanish and involved hands-on practice with the equipment to be used for the water quality tests.
ANY EVENTS THAT COULD HAVE A BEARING ON DATA QUALITY?
One of the WB consultants who was brought in to provide the water quality training was unable to attend in person and had to join virtually, providing the training over Skype. Whilst we had one member of the ESA team on hand to provide training in person, the team on the ground was a little short-staffed for the water quality testing training and supervision. In particular, the number of samples to be taken in each of the sub-set of selected communities and at different points in the system was quite complicated and led to some confusion in sample collection in the field. We were able to bring in a local field monitor towards the end of the fieldwork to ensure re-collection in communities where samples had been missed. Significantly fewer chlorine samples were collected than originally anticipated and budgeted for, but field supervisors and monitors were able to verify that this reflected the lack of chlorination in rural water systems.
HOW LONG DID AN INTERVIEW TAKE ON AVERAGE?
By the end of the pilot, we were able to cut the surveys down to 1.1-2.5 hours for the CAPS survey, 1.5 hours for the System survey (including travel time to different elements but excluding WQ) and 1 hour for the community survey. HH surveys were cut down to 15 minutes, and finally the PAT survey at municipalities took between 40 minutes and 1 hour. This was all in accordance with what had been agreed with the survey firm. The survey firm maintained that they would be able to cover all the communities, with between 1.5 and 2 communities per day between their 7 survey teams.
WAS THERE A PROCESS OF NEGOTIATION BETWEEN HOUSEHOLDS, THE COMMUNITY AND THE IMPLEMENTING AGENCY?
In order to gain approval for the fieldwork, FISE (the government partner) required the Survey Firm to send out a program of the communities to be visited for 15 days of fieldwork at a time. This enabled all partners to ensure that on arrival in the municipalities and the communities, the respondents were there and available to conduct the interview.
ARE ANECDOTAL EVENTS RECORDED?
Any changes that were required and events during mission visits by the WB Field Coordinator to supervise the fieldwork are recorded in email correspondence and AMs shared internally with the WB team and later circulated to the Survey Firm where actions were required. The replacements of communities that were removed from the randomization have been recorded in Stata do-files and are kept in a box folder within the team's archives on the WB system.
HAVE FIELD TEAMS CONTRIBUTED BY SUPPLYING INFORMATION ON ISSUES AND OCCURRENCES DURING DATA COLLECTION?
Yes, there was a constant line of communication maintained between the WB Field Coordinator and the Survey Firm's Field Coordinator to ensure that unforeseen circumstances were properly addressed and solutions found. This was particularly the case for identifying replacement communities where the original requirements were not met. These communications are documented in email correspondence and in the Survey Firm's final report. The WB Field Coordinator also required the survey firm to provide weekly monitoring reports in Excel documenting the number of surveys and water quality samples collected in each of the sample communities. She would then analyze this against the number that were meant to be collected before passing feedback back to the WB team and survey firm. When the Field Monitor was employed in the final weeks of data collection for the water quality check-ups, she was also able to provide further reporting to pick up on errors, feeding back to the Field Coordinator, who was able to ensure that action was taken by the survey firm's field management staff. This was in particular with regard to the number of water samples taken and where they were supposed to be collected, as well as identifying inconsistencies or misinterpretations in the questions being asked in the CAPS and system surveys.
IN WHAT LANGUAGE WAS THE INTERVIEW CONDUCTED?
All interviews, training and piloting were conducted in Spanish.
WERE THERE ANY CORRECTIVE ACTIONS TAKEN BY MANAGEMENT WHEN PROBLEMS OCCURRED IN THE FIELD?
Yes. As identified and elaborated on in the risks section of the baseline report, during the last few weeks of the fieldwork, upon identification of missing data in the water quality samples, the WB hired an additional local consultant to conduct monitoring of water samples. This resulted in the re-collection of missing samples in selected communities, which the local monitor was able to supervise and report back on to the Field Coordinator. A re-training was conducted by the ESA Field Coordinator, and every effort was made to ensure that the number of water samples was increased to be much closer to the targets requested under the terms of reference.
The data processing objective was that the data entered into the databases would be a faithful reflection of the data written on the physical questionnaires. When data entry reports displayed errors or discrepancies, group supervisors were able to give feedback to enumerators to correct or complete data reported as inconsistent. Supervisors would also follow up with the survey respondent, either by phone using contact details provided as part of the survey, or in person if supervisor/quality control staff were still in the area. Sometimes these telephone calls were made by the Field Manager. All data recovery was registered in a different color (blue or red) in the questionnaire to track how the information was obtained (during the visit or through a back-check).
The data entry program generated a series of tables in FoxPro format, which were converted to SPSS format using StatTransfer v12 for documentation. Missing values were also coded and documented in SPSS. For additional information, PROSASR_CODEBOOK_ESA_BASEDAT.PDF can be consulted. No data were edited and no missing values were imputed.
Some data review was carried out in the coding of water systems and PES/CAPS, as well as in the classification of systems, due to differences between the type of system found during the survey and the data recorded in SIASAR's archives. Some system codes were recoded in accordance with the guidelines given by the World Bank's project team.
The documented databases were exported to Stata format to produce the final databases; frequencies were obtained to check that the data followed the flow of the questionnaire. The relational integrity of the databases was verified by generating lists of possible inconsistencies, which were contrasted with the physical questionnaires, and any inconsistencies in data entry were corrected.
These databases were then checked for consistent coding and cross-checked for merge compatibility by the WB Field Coordinator, with inconsistencies sent back to the Survey Firm and corrected before submission.
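As an illustration of the kind of relational-integrity and merge-compatibility check described above, the following sketch flags records in the system and provider databases whose community identifier has no match in the community database. File names and the key variable are hypothetical, not the actual names in the delivered databases.

```python
# Hypothetical sketch of a relational-integrity / merge-compatibility check;
# file names and the key variable (community_id) are illustrative only.
import pandas as pd

community = pd.read_stata("caps_community.dta")
system    = pd.read_stata("caps_system.dta")
provider  = pd.read_stata("caps_provider.dta")

key = "community_id"
for name, table in {"system": system, "provider": provider}.items():
    orphans = table.loc[~table[key].isin(community[key]), key].unique()
    if len(orphans) > 0:
        # Any records flagged here would be contrasted with the physical questionnaires.
        print(f"{name}: {len(orphans)} {key} values with no matching community record")
```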
Not applicable because not a sample survey.
During the survey piloting, Gonzalo Martinez, specialist on the SIASAR information system, which was heavily relied on in survey design, supported the team in supervising training and accompanying pilot data collection.
During the application of surveys in the field, the Field Coordinator Sophie Ayling carried out random checks with the help of Eliette Gonzalez, who was contracted specifically as a data quality monitor.
Comparisons were made during the data collection with the SIASAR data, which was being updated in each of the communities, so that the ESA field team was able to report inconsistencies between the two sources.
World Bank Microdata Library
Name | Affiliation |
---|---|
Laura Natalia Becerra Luna | The World Bank Group |
Name | Affiliation | URL | Email |
---|---|---|---|
Strategic Impact Evaluation Fund | The World Bank Group | https://www.worldbank.org/en/programs/sief-trust-fund | siefimpact@worldbank.org |
Christian Borja Vega | The World Bank Group | | cborjavega@worldbank.org |
Lilian Pena | The World Bank Group | | lpereira1@worldbank.org |
Public Access
Use of the dataset must be acknowledged using a citation which would include:
Joshua Gruber, University of California, Berkeley, Christian Borja Vega, Lilian Pena Weiss, Clementine Marie Stip and Sophie Ayling, World Bank Group. Nicaragua Sustainable Water Supply and Sanitation Project Impact Evaluation 2015-2016, Baseline Survey, Ref. NIC_2015-2016_PROSASRIE-BL_v02_M. Dataset downloaded from [url] on [date].
The user of the data acknowledges that the original collector of the data, the authorized distributor of the data, and the relevant funding agency bear no responsibility for use of the data or for interpretations or inferences based upon such uses.
(c) 2016, The World Bank
Name | Affiliation | Email |
---|---|---|
Strategic Impact Evaluation Fund | The World Bank Group | siefimpact@worldbank.org |
Lilian Pena Pereira Weiss | The World Bank Group | lpereira1@worldbank.org |
Christian Borja Vega | The World Bank Group | cborjavega@worldbank.org |
Clementine Marie Stip | The World Bank Group | cstip@worldbank.org |
Joshua Gruber | University of California, Berkeley | jsgruber@gmail.com
Sophie Ayling | The World Bank Group | sayling@worldbank.org |
DDI_NIC_2015-2016_PROSASRIE-BL_v02_M_WB
Name | Affiliation | Role |
---|---|---|
Development Economics Data Group | The World Bank Group | Documentation of the study |
2023-09-27
Version 02 (September 2023)
This version of the study documentation is identical to version 01, but includes the deidentified versions of the datasets that were previously submitted.