The transition from socialism to a market economy has transformed the lives of many people. What are people's perceptions and attitudes to transition? What are the current attitudes to market reforms and political institutions?
To analyze these issues, the EBRD and the World Bank have jointly conducted the comprehensive, region-wide "Life in Transition Survey" (LiTS), which combines traditional household survey features with questions about respondents' attitudes and is carried out through two-stage sampling with a random selection of households and respondents.
The LiTS assesses the impact of transition on people through their personal and professional experiences during the first 15 years of transition. LiTS attempts to understand how these personal experiences of transition relate to people’s attitudes toward market and political reforms, as well as their priorities for the future.
The main objective of the LiTS was to build on existing studies to provide a comprehensive assessment of the relationships among life satisfaction and living standards, poverty and inequality, trust in state institutions, satisfaction with public services, and attitudes to a market economy and democracy, and to provide valuable insights into how transition has affected the lives of people across a region comprising 16 countries in Central and Eastern Europe (“CEE”) and 11 in the Commonwealth of Independent States (“CIS”). Turkey and Mongolia were also included in the survey.
Kind of data
Sample survey data [ssd]
Downloaded from EBRD website on January 30, 2011
The LITS was to be implemented in the following 29 countries: Albania, Armenia, Azerbaijan, Belarus, Bosnia and Herzegovina, Bulgaria, Croatia, Czech Republic, Estonia, Former Yugoslav Republic of Macedonia (FYROM), Georgia, Hungary, Kazakhstan, Kyrgyz Republic, Latvia, Lithuania, Moldova, Mongolia, Poland, Romania, Russia, Serbia and Montenegro, Slovak Republic, Slovenia, Tajikistan, Turkey, Turkmenistan, Ukraine and Uzbekistan.
Producers and sponsors
European Bank for Reconstruction and Development
European Bank for Reconstruction and Development
A total of 1,000 face-to-face household interviews per country were to be conducted, with adult (18 years and over) occupants and with no upper limit for age. The sample was to be nationally representative. The EBRD’s preferred procedure was a two-stage sampling method, with census enumeration areas (CEA) as primary sampling units and households as secondary sampling units. To the extent possible, the EBRD wished the sampling procedure to apply no more than 2 stages.
The first stage of selection was to use as a sampling frame the list of CEA's generated by the most recent census. Ideally, 50 primary sampling units (PSU's) were to be selected from that sample frame, with probability proportional to size (PPS), using as a measure of size either the population, or the number of households.
The second sampling stage was to select households within each of the primary sampling units, using as a sampling frame a specially developed list of all households in each of the selected PSU's defined above. Households to be interviewed were to be selected from that list by systematic, equal probability sampling. Twenty households were to be selected in each of the 50 PSU's.
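The systematic, equal-probability household draw described above can be sketched as follows. This is a minimal illustration only; the function name and the 240-household listing are hypothetical, not part of the survey's actual tooling.

```python
import random

def systematic_sample(units, n):
    """Systematic, equal-probability sampling: choose a random start in
    [0, step), then take every step-th unit down the ordered list."""
    step = len(units) / n               # sampling interval
    start = random.random() * step      # random start strictly below step
    return [units[int(start + i * step)] for i in range(n)]

# Hypothetical listing of 240 households in one selected PSU
households = [f"HH-{i:03d}" for i in range(1, 241)]
selected = systematic_sample(households, 20)   # 20 households per PSU
```

Because every unit on the list faces the same inclusion probability (n / N), the draw is equal-probability, and the fixed interval spreads the sample evenly across the listing order.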
The individuals to be interviewed in each household were to be selected at random, within each of the selected households, with no substitution if possible.
ESTABLISHMENT OF THE SAMPLE FRAME OF PSU’s
In each country we established the most recent sample frame of PSU’s which would best serve the purposes of the LITS sampling methodology. Details of the PSU sample frames in each country are shown in table 1 (page 10) of the survey report.
In the cases of Armenia, Azerbaijan, Kazakhstan, Serbia and Uzbekistan, CEA’s were used. In Croatia we also used CEA’s but in this case, because the CEA’s were very small and we would not have been able to complete the targeted number of interviews within each PSU, we merged together adjoining CEA’s and constructed a sample of 1,732 Merged Enumeration Areas. The same was the case in Montenegro.
In Estonia, Hungary, Lithuania, Poland and the Slovak Republic we used Eurostat’s NUTS area classification system.
[NOTE: The NUTS (from the French “nomenclature des unités territoriales statistiques”, or in English “Nomenclature of territorial units for statistics”) is a uniform and consistent system that runs on five different levels and is widely used for EU surveys, including the Eurobarometer (a survey comparable to the Life in Transition Survey). As a hierarchical system, NUTS subdivides the territory of the country into a defined number of regions at NUTS 1 level (population 3-7 million), NUTS 2 level (800,000-3 million) and NUTS 3 level (150,000-800,000). At a more detailed level, NUTS 3 is subdivided into smaller units (districts and municipalities), called “Local Administrative Units” (LAU). The LAU is further divided into LAU 1 (formerly NUTS 4) and LAU 2 (formerly NUTS 5).]
Albania, Bulgaria, the Czech Republic, Georgia, Moldova and Romania used the electoral register as the basis for the PSU sample frame. In the other cases, the PSU sample frame was based on either local geographical or administrative and territorial classification systems. The total number of PSU’s in the sample frame varied from 182 in the case of Mongolia to over 48,000 in the case of Turkey. To ensure the safety of our fieldworkers, we excluded from the PSU sample frame territories (in countries such as Georgia, Azerbaijan, Moldova and Russia) in which there was conflict or political instability. We also excluded areas which were not easily accessible because of their terrain or which were sparsely populated.
In the majority of cases, the source for this information was the national statistical body for the country in question, or the relevant central electoral committee. In establishing the sample frames and to the extent possible, we tried to maintain a uniform measure of size namely, the population aged 18 years and over which was of more pertinence to the LITS methodology. Where the PSU was based on CEA’s, the measure was usually the total population, whereas the electoral register provided data on the population aged 18 years old and above, the normal voting age in all sampled countries. Although the NUTS classification provided data on the total population, we filtered, where possible, the information and used as a measure of size the population aged 18 and above. The other classification systems used usually measure the total population of a country. However, in the case of Azerbaijan, which used CEA’s, and Slovenia, where a classification system based on administrative and territorial areas was employed, the measure of size was the number of households in each PSU.
The accuracy of the PSU information depended, to a large extent, on how recently the data had been collected: where the data were collected recently, the information could be considered relatively accurate. However, in some countries we believed that more recent information was available but, because the relevant authorities were not prepared to share it with us, citing secrecy reasons, we had no alternative but to use less up-to-date data. In some countries, the age of the available data makes the figures less certain. An obvious case in point is Bosnia and Herzegovina, where the latest available figures date back to 1991, before the Balkan wars. The available population figures take no account of the casualties suffered among the civilian population, the resulting displacement and the subsequent migration of people.
Equally, there have been cases where countries have experienced economic migration in recent years, either to the European Union following accession in May 2004 (Hungary, Poland and the Baltic states) or to other countries within the region (e.g. Armenians to Russia, Albanians to Greece and Italy), and the available figures may not accurately reflect this. As most economic migrants tend to be men, the actual proportion of females in a population was, in many cases, higher than the available statistics would suggest. Migration in recent years has also occurred from rural to urban areas in Albania and the majority of the Asian Republics, as well as on a continuous basis in Mongolia because of its nomadic population.
In broad terms the following sampling methodology was employed:
· From the sample frame of PSU’s we selected 50 units
· Within each selected PSU, we sampled 20 households, resulting in 1,000 interviews per country
· Within each household we sampled 1 and sometimes 2 respondents
The sampling procedures were designed to leave no free choice to the interviewers. Details on each of the above steps as well as country specific procedures adapted to suit the availability, depth and quality of the PSU information and local operational issues are described in the following sections.
Selection of PSU’s
The PSU’s of each country (all in electronic format) were sorted first into metropolitan, urban and rural areas (in that order), and within each of these categories by region/oblast/province in alphabetical order. This ensured a consistent sorting methodology across all countries and also that the randomness of the selection process could be supervised.
To select the 50 PSU’s from the sample frame, we employed implicit stratification and sampling was done with PPS. Implicit stratification ensured that the sample of PSU’s was spread across the primary categories of the stratification variables, giving a better representation of the population without explicitly stratifying the PSU’s, thus avoiding difficulties in calculating sampling errors at a later stage.
In brief, the PPS involved the following calculations:
· Cumulated size of the selected PSU (CEA, NUTS, etc)
· Scaled cumulated size based on the number of selected PSU’s (50) and the total size of the PSU’s (depending on country)
· Randomly shifted scaled cumulated size using a random number between 0-1
The selected PSU’s were those where the integer part of the shifted, scaled cumulated size changed.
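The PPS steps listed above can be sketched as follows. This is a minimal illustration under stated assumptions; the function name and the frame of sizes are hypothetical. Note that a PSU whose size exceeds the sampling interval can be allocated more than one sampling point, as happened with some large metropolitan PSU's.

```python
import random

def pps_systematic(sizes, n):
    """Systematic PPS selection: cumulate the measures of size, scale the
    cumulated totals so they run from 0 to n, shift them by a random
    number in [0, 1), and select a PSU each time the integer part of the
    shifted, scaled cumulated size changes."""
    total = sum(sizes)
    shift = random.random()               # random number between 0 and 1
    selected, cum, prev = [], 0.0, 0
    for i, size in enumerate(sizes):
        cum += size                       # cumulated size
        scaled = cum * n / total + shift  # scaled, randomly shifted
        hits = int(scaled) - prev         # integer-part changes here
        selected.extend([i] * hits)       # large PSUs can get > 1 point
        prev = int(scaled)
    return selected

# Hypothetical frame: 10 small PSUs and 1 very large metropolitan PSU
sample = pps_systematic([1_000] * 10 + [200_000], 5)
```

Each PSU's chance of selection is proportional to its measure of size, and exactly n sampling points (counting multiplicity) are always returned.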
Appendix A of the survey report (organised in country sections) shows the 50 PSU’s selected in each country, as well as where these were geographically located. As can be seen, the population per PSU ranged from a few hundred people to several hundreds of thousands, especially in metropolitan and urban areas. In some large PSU’s (e.g. Tashkent in Uzbekistan, Almaty in Kazakhstan) the PPS apportioned more than one sampling area within the same PSU, because of the large population of those units.
Although we would have liked to have PSU’s of approximately equal size (preferably with population less than around 2,000 inhabitants), this was not feasible, because the PSU’s obtained from the various sources described in section 4.3.1, did not go down to that level of detail.
The PSU sampling methodology described in this section was implemented in 28 countries. The exception was Mongolia, where we had to adapt the PSU sampling process to account for the availability and quality of the data, the very low population density, and the fact that between 30% and 50% of the population (according to some estimates) live nomadic lives in both urban and rural areas.
The normal stratification used in Mongolia for comparable surveys (such as the Asiabarometer), and which we also followed in this case, is to stratify the sample explicitly, allocating 19 PSU’s (38%) to the area of the capital Ulaanbaatar (metropolitan, 1st stratum) and the remaining 31 to other urban and rural areas (2nd stratum). We then used PPS selection of PSU’s within each stratum.
In a number of countries (Armenia, Bosnia and Herzegovina, Estonia, FYROM, Kyrgyz Republic, Lithuania, Romania, Russia, Tajikistan, Ukraine and Uzbekistan), a few (between 1 and 9) of the originally selected PSU’s, mostly in rural areas, had to be replaced during the course of the fieldwork. The replaced PSU’s are given in Appendix A, under each country section. To the extent possible, we tried to replace PSU’s by selecting other PSU’s matching the population, socioeconomic profile and proximity of the originally selected areas.
The most common reason for PSU replacement was geographical remoteness and consequent difficulty in accessing the area, especially given the poor road and transport infrastructure in many rural parts. There were also cases where PSU’s had such low population densities that distances between settlements were great, and where villages shown on maps had subsequently broken up or been abandoned. Had we known before the PSU selection how difficult these PSU’s were to access, we would have excluded them from the outset.
In some other cases, poor weather conditions and localised flooding exacerbated the problems and because of time limitations, we could not wait until the weather conditions improved to re-visit the PSU’s which were ultimately replaced.
PSU’s excluded from sampling
Certain territories of some countries (Albania, Azerbaijan, Kazakhstan, Mongolia, Moldova, Russia, Serbia and Tajikistan) were excluded from the original sampling, either because of conflict or political instability in those areas, or because the selected areas were inaccessible. In Serbia’s case, it was agreed before the start of the project that Kosovo would not be included in the survey.
Selection of dwellings within each chosen PSU
This part of the sampling process presented the most challenges, because of the significant differences in the quality, depth, availability and size of PSU’s at this level and of other pertinent data in each country. As can be seen from the selected PSU’s, some were very large. Listing all eligible households and applying single-stage sampling within each PSU (or 2nd-stage sampling as part of the overall process) was impracticable because of timescale and budget limitations: listing all the households, especially in large PSU’s (sometimes whole cities), would have amounted to a census enumeration exercise.
2nd stage sampling
In most of the countries it was necessary to apply more than two sampling stages to select households. These stages are described below.
The 2nd stage involved the selection of 4 segments/areas within each PSU, which made the listing of dwellings, and ultimately the sampling of households, more practicable. For each selected PSU we obtained a hard-copy map of the area and split it into small segments/zones. To the extent possible we aimed to have zones with equal populations although, as it turned out, this was not always feasible. Each segment was then given an identification number, starting from the north-east segment. As illustrated in the diagram below, we numbered the segments from left to right (the “reading a book” method). Segments which did not contain dwellings (such as parks and non-built-up areas) were not numbered and were excluded from sampling.
The next step was to select 4 zones with the intention of conducting 5 household interviews in each (total of 20 per PSU). The selection of the zones was done using systematic, equal probability sampling.
Prior to fieldwork commencing, interviewers accompanied by fieldwork supervisors visited each selected segment/area and listed on paper all eligible dwellings (those likely to be inhabited by households), including apartments in blocks of flats. Each eligible dwelling was assigned a unique serial number. It is important to note that during this exercise we were listing dwellings, not households, as listing households would have taken considerable time. Furthermore, we did not want to disturb some households twice (i.e., the first time to find out how many households lived in a dwelling and the second time to interview, if selected). For the purposes of this research we assumed that each dwelling, including each apartment in a block of flats, was inhabited by one household.
Non-eligible dwellings such as hospitals, prisons, night clubs, offices etc, were not listed as these were excluded from the scope of the LITS. In the case of remote settlements, it was not always feasible to conduct this preparatory work because of the logistical difficulties involved. In such cases, we estimated the number of dwellings from the population and average size of the household in that area.
The 3rd sampling stage involved the selection of the eligible dwellings (assuming 1 household in each) within each of the selected areas. The nominal number of dwellings was 5. However, before proceeding with the sampling process, each country team estimated, based on previous experience, the number of household contacts needed to complete 5 interviews, taking into account the usual refusal rate and the likelihood of no interview for reasons such as finding nobody at home. The number of additional dwellings varied between 3 and 4, depending on the country and the PSU.
The total number of dwellings (5 plus 3-4 possible replacements) was selected from the lists prepared by the fieldworkers during the listing exercise, using systematic, equal-probability sampling. To the selected dwellings (5 + replacements) we again applied systematic, equal-probability sampling (the “4th stage”), in this case to “isolate” those which were replacements. The interviewers were provided with the contact details of the 5 selected dwellings (primary targets) and were told to exhaust all possible efforts to conduct interviews with the households of those dwellings only. The interviewers were not told about the reserve dwellings, whose existence, and the possibility of using them, was known only to fieldwork managers and senior supervisors.
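The 3rd and 4th stages described above can be sketched together as follows. This is an illustrative sketch with a hypothetical helper name and listing, assuming 3 reserve dwellings: one systematic draw over the segment listing yields the 8 dwellings, and a second systematic draw over those 8 marks which 3 are reserves.

```python
import random

def draw_dwellings(listing, primaries=5, reserves=3):
    """3rd stage: systematic equal-probability draw of primaries + reserves
    dwellings from the segment listing.
    4th stage: a second systematic draw over the drawn set isolates the
    reserve dwellings from the primary targets."""
    total = primaries + reserves
    step = len(listing) / total
    start = random.random() * step
    drawn = [listing[int(start + i * step)] for i in range(total)]
    # 4th stage: systematically mark the reserve positions among the drawn
    step2 = total / reserves
    start2 = random.random() * step2
    reserve_pos = {int(start2 + i * step2) for i in range(reserves)}
    primary = [d for i, d in enumerate(drawn) if i not in reserve_pos]
    reserve = [drawn[i] for i in sorted(reserve_pos)]
    return primary, reserve   # only `primary` goes to the interviewers

# Hypothetical segment listing of 60 eligible dwellings
primary, reserve = draw_dwellings([f"D-{i:02d}" for i in range(60)])
```

Keeping the reserve list with the fieldwork managers, as the text describes, means the split computed here would only ever be revealed to interviewers one replacement at a time.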
Our aim whilst developing and implementing the sampling methodology was to ensure that the sampling procedures left no free choice to the interviewers. In those cases where more than one household resided in the same dwelling we interviewed the household which first opened the door. We made 3 attempts to interview the selected households before proceeding to the replacement households.
Additional sampling stages
In some cases, once the 4 areas had been selected (as discussed in the previous section), it was necessary to apply additional sampling stages. This occurred when the field team visited an area for the purpose of listing all its dwellings and discovered that, because of their large number, it would have been impracticable to list them all. In such cases, the originally selected areas (the four described in the previous section) were further divided into smaller segments. Numbering and selection of the smaller segments followed the same procedures as those described above for the 2nd stage.
Country sampling stages
In the majority of countries, the sampling process involved 3 stages: the 1st for PSU’s, the 2nd for areas within PSU’s and the 3rd for dwellings within areas. In Azerbaijan, Bulgaria, Serbia, Montenegro and Estonia, we applied two stages of sampling. In Azerbaijan and Bulgaria we had information on the number of dwellings in each PSU and made the selection using systematic, equal-probability sampling. In Serbia, Montenegro and Estonia, although information on the number of dwellings within each PSU was available, the holders of this information refused to share it with us; in these countries, selection of the dwellings was done by the statistical institutes using systematic, equal-probability sampling, and a list was provided to us. In Hungary and Russia, for some (but not all) PSU’s, it was necessary to apply more than 3 stages, as explained in the section on additional sampling stages above.
Selection of household respondents
In each household we sampled sometimes one and sometimes two respondents. The first respondent was always the head of the household or another knowledgeable member, being the person deemed to have the most knowledge of household issues (roster and expenses). The second respondent sampled was the household member aged 18 years or over who had last had a birthday.
Where the head of the household did not know the precise dates of birth of adult members, or the list of birthdays was incomplete, we used the Kish grid method to select the “principal” respondent. There were cases where the head of the household and the principal respondent were the same person; this happened when the head of the household was also the person who had last had a birthday. There could never be more than two respondents per household. The head of the household was responsible for answering Sections 1 and 2 of the questionnaire (household roster and expenses) and the principal respondent Sections 3-7 (life in transition).
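The last-birthday rule described above can be sketched as follows. The household roster, names and fieldwork date are hypothetical, and the random fallback merely stands in for the Kish grid used when birthdays were unknown; it is not an implementation of the grid itself.

```python
import datetime
import random

def principal_respondent(adults, today=datetime.date(2006, 9, 1)):
    """Select the adult (18+) household member who last had a birthday.

    adults: list of (name, date_of_birth) pairs; date_of_birth may be
    None when the head of household does not know it."""
    def days_since_birthday(dob):
        last = dob.replace(year=today.year)
        if last > today:                   # birthday not yet reached this year
            last = last.replace(year=today.year - 1)
        return (today - last).days
    known = [(name, dob) for name, dob in adults if dob is not None]
    if not known:                          # stand-in for the Kish grid
        return random.choice(adults)[0]
    return min(known, key=lambda p: days_since_birthday(p[1]))[0]

# Hypothetical roster of adult household members
household = [("Ana", datetime.date(1970, 8, 20)),
             ("Boris", datetime.date(1955, 1, 2)),
             ("Cara", datetime.date(1988, 12, 5))]
chosen = principal_respondent(household)   # Ana: birthday 12 days earlier
```

Because the most recent birthday is effectively random with respect to household characteristics, this rule gives a quasi-random within-household selection without needing a full roster draw.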
As mentioned earlier we made 3 attempts to interview eligible households. As table 2 in the survey report shows, most interviews were successfully completed on the first visit. In total 29,002 successful interviews were completed; 1,000 per country, except in the Slovak Republic and Slovenia where an additional interview was conducted in each country.
On average, 79% of the interviews were completed on the first visit, 16% on the second and 6% on the third. More interviews were successfully completed on a first visit in rural than in urban areas, with people, especially in capital cities, often being absent or returning home late from work. In addition, in some societies, such as the Balkans and the Asian Republics, high initial success rates can be attributed to the structure of local society, where several generations of a family live in the same house: there is always somebody home.
Interviews completed on the 2nd or 3rd attempt were those where either the household head or the principal respondent had been absent during previous visits. Reasons for not being at home included harvest time, when some respondents were still in the fields until late at night (rural areas), or respondents still being at work (urban areas).
Another issue that caused more than one interviewer visit was that fieldwork coincided with the Muslim holy month of Ramadan, so respondents in Muslim countries were not available at certain times (breaking the fast). The hours that Muslim interviewers could work were also curtailed.
Dates of collection
Mode of data collection
The first version of the LITS questionnaire was piloted (with a sufficiently diverse respondent profile: household size, locality, age, gender, etc.) so as to adapt questions, if necessary, to make them more appropriate to the local context, to ensure that respondents understood the questions, to identify problems in the instrument, and to estimate the length of interviews.
On average, the pilot interviews took 74 minutes to complete (min=48, max=113, S.D.=12). Following consultations with the EBRD, the length of the questionnaire was reduced to approximately 45 minutes but, as explained later in this report, many respondents took longer to finish it.
As a result of the findings from the pilots, feedback from the countries during the workshops, and the two teleconferences, as well as feedback from the EBRD and our experience with comparable surveys, some questions and concepts were further developed / refined. These included:
· The amount of personal details we could ask respondents to provide us
· Which members should be included in the household roster
· Appropriate methods for sampling household respondents
· Definitions related to self-employment, work for an employer, occupation and industry of employment, etc
The definition of who should be included in the household roster was tightened to exclude members of the household likely to be away from home on a permanent basis, such as students and working husbands (mainly in the Baltic States). This was to reduce the incidence of non-interviews.
For the purposes of the LITS, a household was defined as “the people that live together in this dwelling, pool their money and have meals in common on a regular basis”. Our interviewers were instructed to read this definition to the head of each household and to ask them to exclude from the household roster persons who were away from home on a permanent basis (for work or studies).
Due to the prevailing political or social conditions in some countries it was necessary to adapt some questions/concepts. These changes which were agreed with the EBRD are described in the remainder of this section.
The standard introduction to be read to respondents prior to the interview made reference to the former Soviet Union and the transition period. As Turkey was not part of the Soviet bloc, it was necessary to change the introduction read to Turkish respondents. The question about membership of the Communist Party (Q.7.02) was not asked as this did not apply.
With elections forthcoming in November 2006, we did not ask Q.7.04 (attend lawful demonstrations, participate in strikes, join a political party, sign petitions), because this question might have been perceived as provoking or inciting people to do so.
Because of local sensitivities we did not ask Q.7.02 (Communist Party membership), Q.7.04 (attend lawful demonstrations, participate in strikes, join a political party, sign petitions), Q.3.03 (trust in the presidency) and Q.3.08 (on injustice as a cause of poverty).
Language of questionnaire
In some countries with substantial ethnic minorities we sometimes had to use questionnaires in two languages (the local language and one other). For example, in Azerbaijan, Georgia, Armenia, the Baltic States and some of the Asian Republics, we used questionnaires in the local language as well as in Russian, whilst in the former Yugoslav Republics we sometimes had to use the Albanian version.
Length of the questionnaire
Although the questionnaire was expected to take around 45 minutes to complete, feedback from the fieldworkers suggested that many people took longer to finish it: interviews ranged from 40 minutes to well over one hour. Although younger respondents were more difficult to recruit, they tended to answer questions faster than older people or respondents with only basic education, who sometimes struggled to understand some of the questions and concepts and needed more explanation. Some respondents regarded the interview as too long and showed signs of fatigue and lapses of concentration towards its end.
Issues and comments on the survey instrument
As a general comment, despite frequent re-assurances about confidentiality, some respondents appeared to be less convinced than others.
Generally the sensitive questions on household sources of income and unofficial payments were received with suspicion and mistrust by a number of respondents, and we believe that some of the answers given may not reflect reality. Conclusions from these types of questions should be treated with caution.
Section 1 (Household roster)
Some heads of household could not provide exact dates of birth, or took time to remember all the birthdays of household members. In such cases, other family members would intervene in the interview to provide the missing information. Some people felt uncomfortable supplying their names and addresses, given that before commencing the interview they had been told that their responses were confidential; respondents were also concerned about the general issue of personal data protection. We suspect that in some cases there was a tendency for heads of household to understate the actual number of household members where communal utility charges (mostly in apartment blocks) were based on the number of people living in the household.
Section 2 (Housing expenses)
Housing and ownership
The results of the questions about housing and ownership of dwellings (Q.2.01-Q.2.04) need to be treated with caution because of the likelihood of different interpretations of the meaning of the questions by some respondents and our interviewers. On Q.2.01 (type of dwelling), it is possible that some interviewers may not have had the same understanding of the type of dwelling as people in more developed countries: in some particularly poor areas of certain countries, improvised housing units may have been classified as detached houses (which, in a sense, they are), although their construction and structure are obviously not to the standards found in developed countries. Some owners of recently built apartments and houses did not yet have title deeds to their property, because of time-consuming and bureaucratic local registration procedures, and so found it difficult to answer some of the questions. In some countries, dwellings can be built on somebody else’s land; in these cases, ownership is difficult to ascertain, because the building belongs to one person (who pays rent) and the land to another.
We also suspect mistrust about the property questions, because some people appeared uncomfortable disclosing information regarding their property rights, especially if these had not been obtained 100% legally.
Responses to the questions on water, heating and other utilities (Q.2.05 and Q.2.06) also need to be regarded with care. Although households may not have access to piped tap water, or may suffer frequent cuts, some respondents commented that they use other sources of supply, such as water stored in roof-top tanks, collected from streams, or even bought from water tankers which visit their neighbourhoods on a regular basis. Equally, people may not have public central heating but are not necessarily going cold, because they use stand-alone central heating systems, electrical heaters, coal, firewood and other means to heat their homes.
Some respondents experienced problems in calculating household expenditure on food, clothing, transport and communication, and other goods and services for the past 30 days and the past year (Q.2.07 and Q.2.08), and had to consult other family members (usually the partner or spouse) to arrive at accurate estimates. In analysing the results, the seasonality of the expenses (for this survey the data were for the summer season) may need to be taken into account. Regarding health expenses, and for the avoidance of doubt, we advised respondents to exclude the contributions deducted automatically from their salaries.
As concerns annual expenses, some respondents mentioned that the cost of firewood used for heating and cooking was a significant expense.
Sources of income
Respondents were wary about answering Q.2.10 and may have reported only officially declared sources of income, being reluctant to disclose income received from other, especially unofficial, sources. This reluctance can, in many cases, be associated with the suspicion and distrust shown towards interviewers by respondents who believed they were working for the government, tax authorities or other official agencies. This suspicion was reinforced by the fact that respondents were asked to provide their name and address to the interviewer, despite being told that the survey was confidential.
One factor that needs to be understood with regard to some respondents’ answers to Q.2.11 and Q.2.12 is that their perceptions of the past are coloured by their own situation. In comparing their household now to 1989, they were looking back to a time when they were younger, healthier, single, living with their parents, not yet retired, and so on. In analysing the results these personal factors may need to be taken into consideration, because some respondents would perceive that their lives had got worse over the intervening period when this may just have been due to the ageing process, and not necessarily indicative of deteriorating conditions during transition. Some respondents commented that, overall, conditions today are better than 17 years ago only if one is working; for the unemployed the situation is much worse. In some cases, respondents were perhaps answering Q.2.11 from an aspirational perspective, i.e. where the household would like to be as opposed to its actual situation. There were also cases where we felt that respondents were embarrassed to give an honest answer, especially if their household was at the bottom of the ladder.
Making ends meet
We think that some respondents answered Q.2.15 with an ideal salary in mind, whilst others were thinking of their actual salary.
Section 3 (Attitudes and values)
Whilst some respondents answered this section easily and promptly, for others there was a great deal of mistrust and suspicion surrounding the questions in this section. A number of people regarded the questions as personal and confidential, and in some cases seemed to give evasive answers. In some countries there were cases where respondents became angry and impatient with such questions because they were tired of politics and economics: for them, despite years of talk about such issues, there have been no tangible improvements in their own lives. Some of the questions in Q.3.01 touched upon respondents' pride ("how well have they done in life"); they may therefore have been inclined to answer that they had done better in life than their parents or classmates, even if that was not the reality. Responses to the question on whether there is less corruption now than in 1989 (Q.3.01) need to be interpreted carefully, as some respondents mentioned that pre-1989 corruption took the form of various favours done for individuals or groups, whilst today it has been replaced by monetary corruption.
On trust in institutions (Q.3.03), some people either professed ignorance of these matters or tried to avoid answering such questions. In Belarus, for example, as well as in some of the Asian Republics, some people were afraid about expressing opinions on such matters and were concerned that the interviewer might be trying to provoke them into expressing views that differed from the official line.
In some countries, respondents appeared to be uncomfortable with the questions about unofficial payments (Q.3.13, Q.3.14, and Q.3.15).
Some older people and those living in rural areas struggled to understand some of the questions and indicated that they had little direct contact with some of the institutions mentioned. In some cases, respondents appeared to give more “politically” correct answers than honest and truthful opinions.
People who live in urban areas showed more interest in politics and institutions than those who live in the countryside. Respondents in rural areas often did not care what political system was in place or who was running the country, because this had no significant influence on their lives. Younger respondents had problems comparing life today with life in 1989, and often had to rely on hearsay and the memories of other family members.
Section 4 (Current activities)
Perhaps the biggest issue with this section was the recording of occupation and industry (Q.4.05 and Q.4.06), because many respondents had difficulty classifying themselves against the definitions on the show cards. The process of collecting this information was as follows. We asked respondents to tell us, in their own words, their occupation and the industry in which they worked. We then showed them the occupation and industry show cards and asked them to select the categories they thought best fitted their jobs. If respondents had difficulty with the cards, the interviewers offered advice and guidance on the most likely categories. The method of collecting the employment information (occupation and industry) was discussed with the EBRD during the development of the questionnaire. Whilst both parties agreed that the best option was to record qualitative information and code it post-survey (with the coding done by one person), it was also agreed that this was not a practicable solution because of timing and budgetary constraints. Indeed, collecting such detailed employment information, with the controls needed to verify the data, would constitute a separate survey in its own right.
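The post-survey coding option discussed above can be sketched as a simple keyword-based first pass that routes unmatched answers to a human coder. This is a minimal illustrative sketch: the category labels and keyword lists below are assumptions for demonstration, not the actual LiTS show-card definitions.

```python
# Hypothetical first-pass coder for free-text occupation answers.
# Category names and keywords are illustrative, not the real show cards.
SHOWCARD_KEYWORDS = {
    "Agricultural worker": ["farm", "agricult", "livestock"],
    "Teacher": ["teach", "school", "lecturer"],
    "Driver": ["driver", "taxi", "truck"],
}

def code_occupation(free_text):
    """Return the first matching show-card category, or None so that
    a single human coder can review the answer, as the text envisages."""
    text = free_text.lower()
    for category, keywords in SHOWCARD_KEYWORDS.items():
        if any(kw in text for kw in keywords):
            return category
    return None  # unmatched: flag for manual review

print(code_occupation("I drive a taxi in Tbilisi"))   # Driver
print(code_occupation("traditional healer"))          # None -> manual review
```

Even a crude pass like this shows why the EBRD and the contractor judged full post-survey coding impracticable: every unmatched or ambiguously matched answer still needs verification by hand.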
Respondents with a lower level of education sometimes could not understand the question regarding changes in the ownership of enterprises without the interviewer's help. There may also have been confusion among farmers, who sometimes classified themselves as self-employed.
Section 5 (Education and labour)
Although this section did not cause many problems, some respondents were unsure about the educational history and occupation (in terms of "principal job") of their parents (Q.5.03 and Q.5.05).
Section 6 (Life history)
For most respondents this section took the longest to complete, and at this stage they started showing signs of fatigue and lack of concentration.
As a general comment on the option "Not Applicable" (code 19): the questions were always asked, and respondents indicated when they did not apply. "Not Applicable" should be interpreted to mean either that an event did not take place during the intervening period (for example, the respondent did not get married or did not have a child) or that it does not apply at all (for example, military service for women). Note that the event may in fact have happened, e.g. the respondent got married, but if it occurred before 1989 the answer is still Not Applicable (code 19).
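The interpretation rule above matters when preparing the life-history variables for analysis: code 19 conflates "never happened" with "happened before 1989", so it cannot be treated as ordinary missing data. A minimal sketch of the rule, with illustrative variable names (the actual answer coding in the dataset may differ):

```python
# "Not Applicable" code as described in the documentation.
NOT_APPLICABLE = 19

def event_occurred_since_1989(answer_code):
    """True only if the respondent reported the event during the
    transition period. Code 19 covers both an event that never took
    place and one that took place before 1989, so the two cases
    cannot be distinguished from this code alone."""
    return answer_code != NOT_APPLICABLE

# A respondent married in 1985 and a respondent who never married
# both answer code 19 to the marriage question:
print(event_occurred_since_1989(19))  # False in both cases
```

Analysts who need pre-1989 event histories therefore cannot recover them from these questions alone.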
Some respondents were embarrassed talking about their previous or current jobs or their life history if their partner (wife or husband) was present, as these questions touched upon issues that they regarded as sensitive and personal and not necessarily known by their partner.
Important events and employment history
Although Q.6.01 was meant to be a memory jogger to help respondents remember the dates of their employment and other events, it seems the question did not fully serve its purpose: it still took respondents (especially those with many jobs) considerable time to remember what they had done for a living and where they had worked since 1989 (Q.6.02).
Life in transition
There were cases where even wealthier respondents had chosen to cut down on basic food consumption (Q.6.05) in order to save for fashionable consumer goods, such as a new car, which are seen as a sign of social status. There were also cases where parents had sought monetary help from their children, or remittances from offspring working abroad, but regarded this as a family obligation rather than as turning to relatives for financial assistance. Some respondents understood "relatives" to mean distant relatives only, not children or siblings.
Section 7 (Final questions)
Because of their political nature, a number of respondents were suspicious of and hesitant to answer Q.7.01, Q.7.02, Q.7.03 and Q.7.04. In particular, people were wary of the question about Communist Party membership (Q.7.02), especially if they or their family members had been members. In places with large ethnic minority communities, questions about nationality and religion met with reluctance to answer: people either did not want to discuss these issues or regarded such questions as intrusive. In other cases, the answers given were what respondents thought the interviewer wanted to hear, rather than their real feelings on these subjects.
In response to Q.7.06 (what is your religion?), some respondents based their answers on family background rather than personal belief.
Section 8 (Conduct of interview)
This section was self-completed by the interviewers.
European Bank for Reconstruction and Development
Disclaimer and copyrights
The user of the data acknowledges that the original collector of the data, the authorized distributor of the data, and the relevant funding agency bear no responsibility for use of the data or for interpretations or inferences based upon such uses.