ESOMAR has developed 36 questions to "provide a standard set of questions a buyer can ask to determine whether a sample provider’s practices and samples fit with their research objectives."
Our answers are provided below, or you can download them as a PDF file.
ESOMAR 36 ANSWERS
GMO Research, Inc.
1. What experience does your company have in providing online samples for market research? How long have you been providing this service? Do you also provide similar services for other uses such as direct marketing? If so, what proportion of your work is for market research?
GMO Research Institute Inc. was established in April 2002 with the goal of collecting, researching, and disseminating information on P2P technology. In September 2006, GMO Research Inc. merged with GMO Research Institute Inc., and after the merger the company name was changed to GMO Research Inc. In 2009, Japan Market Intelligence (JMI) became a wholly owned subsidiary of GMO Research, Inc., and in December 2012 the two companies merged into GMO Research Inc., which continues to grow.
2. Do you have staff with responsibility for developing and monitoring the performance of the sampling algorithms and related automated functions who also have knowledge and experience in this area? What sort of training in sampling techniques do you provide to your frontline staff?
Our technical team monitors the response speed of each panelist and the accuracy of feasibility estimates against actual results. Delivery and collection controls are system-controlled and automated, providing consistent sampling quality that does not depend on the knowledge and experience of individual staff.
3. What other services do you offer? Do you cover sample-only, or do you offer a broad range of data collection and analysis services?
In addition to online sampling, GMOR also offers online recruiting, home-use tests, survey programming, and hosting. We also provide data analysis services that combine a variety of data sources (Engagement Lab).
Sample sources and Recruitment
4. Using the broad classifications above, from what sources of online sample do you derive participants?
GMO Research has been able to meet the various research needs of our clients by leveraging GMO Internet Group's long-standing ability to reach general consumers and by building effective proprietary panels, infoQ and Z.com Research, for research in multiple countries. Panelists are recruited through banner advertising on our group's media as well as through registration on our SNS.
In addition, by leveraging our experience in operating proprietary panels, we have been able to network with various partner panels. At present, Asia Cloud Panel has built a high-quality panel of more than 55 million panelists, all of whom have permission to cooperate in surveys.
By adopting a panel mix approach, we are able to conduct sampling that eliminates the bias attributable to the media characteristics of any single source, which could arise if all panelists came from one panel.
5. Which of these sources are proprietary or exclusive and what is the percent share of each in the total sample provided to a buyer? (Assume proprietary to mean that the sample provider owns the asset. Assume exclusive to mean that the sample provider has an exclusive agreement to manage/provide access to sample originally collected by another entity.)
GMO Research provides more than 90% of the samples from our "Cloud Panel," a panel that we manage ourselves. Other samples are provided in cooperation with partner sample providers on a case-by-case basis.
6. What recruitment channels are you using for each of the sources you have described? Is the recruitment process 'open to all' or by invitation only? Are you using probabilistic methods?
Are you using affiliate networks and referral programs and in what proportions? How does your use of these channels vary by geography?
As for our own panels, anyone who is a resident of the countries we serve can participate by registering. We use both affiliate and referral programs as recruitment channels, and "natural search inflow" and "refer-a-friend" account for more than half of our registered users.
7. What form of validation do you use in recruitment to ensure that participants are real, unique, and are who they say they are?
Describe this both in terms of the practical steps you take within your own organisation and the technologies you are using. Please try to be as specific and quantify as much as you can.
Some of our own panels require SMS or phone number verification by cell phone for membership registration. Registered members are checked periodically, and if they are found to be fraudulent, their membership rights are revoked. We also conduct quality check surveys once a month to ensure that we do not send out surveys to members who respond inappropriately.
8. What brand (domain) and/or app are you using with proprietary sources? Summarise, by source, the proportion of sample accessing surveys by mobile app, email or other specified means.
We do not currently use apps, but use PWAs to provide an environment that is easy to use for smartphone users. The percentage of smartphone users varies by country, but in Japan, the percentage of smartphone users accessing infoQ is about 40%, while in China it is 53% and in India it is 80%.
9. Which model(s) do you offer to deliver sample?
Managed service, self-serve, or API integration?
All models are offered.
10. If offering intercepts, or providing access to more than one source, what level of transparency do you offer over the composition of your sample (sample sources, sample providers included in the blend). Do you let buyers control which sources of sample to include in their projects, and if so how? Do you have any integration mechanisms with third-party sources offered?
The samples we provide as Cloud Panel are merged into a single, deduplicated panel on our side. When other third-party sources are used, they can be identified to the client. Integration with third-party sources is done using cookies set on the survey cover page when the survey is answered.
11. Of the sample sources you have available, how would you describe the suitability of each for different research applications? For example, Is there sample suitable for product testing or other recruit/recall situations where the buyer may need to go back again to the same sample? Is the sample suitable for shorter or longer questionnaires? For mobile-only or desktop-only questionnaires? Is it suitable to recruit for communities? For online focus groups?
Our 55 million members in Asia are managed by unique IDs, so we can target and distribute surveys to them directly. This makes us very well suited to conducting multiple surveys with the same members.
Sampling And Project Management
12. Briefly describe your overall process from invitation to survey completion. What steps do you take to achieve a sample that “looks like” the target population? What demographic quota controls, if any, do you recommend?
We can randomly distribute questionnaires against quotas according to the client's requirements. In countries such as Japan, where there are enough distributable panelists, the resulting numbers will be relatively close to the online population ratio. If requested by the client, we can provide tighter control in countries where we have a sufficient number of members; quota controls vary and depend on the needs of each project. We also conduct annual profiling surveys to capture targeting demographics such as B-to-B, lifestyle, and automotive attributes.
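The quota-controlled random distribution described above can be sketched as follows. This is a minimal illustration only: the field names (`gender`, `age_band`) and the quota-cell structure are assumptions for the example, not GMO Research's actual panel schema or distribution system.

```python
import random

# Hypothetical panel records; field names are illustrative.
panel = [
    {"id": 1, "gender": "F", "age_band": "18-34"},
    {"id": 2, "gender": "M", "age_band": "18-34"},
    {"id": 3, "gender": "F", "age_band": "35-54"},
    {"id": 4, "gender": "M", "age_band": "35-54"},
    {"id": 5, "gender": "F", "age_band": "18-34"},
]

# Target number of invitations per (gender, age_band) cell.
quotas = {("F", "18-34"): 1, ("M", "18-34"): 1, ("F", "35-54"): 1}

def draw_sample(panel, quotas, seed=0):
    """Shuffle the panel, then select panelists until each quota cell is filled."""
    rng = random.Random(seed)
    shuffled = panel[:]
    rng.shuffle(shuffled)
    remaining = dict(quotas)
    selected = []
    for person in shuffled:
        cell = (person["gender"], person["age_band"])
        if remaining.get(cell, 0) > 0:
            selected.append(person["id"])
            remaining[cell] -= 1
    return selected
```

Randomizing before selection is what keeps each quota cell an unbiased draw from the eligible panelists for that cell.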
13. What profiling information do you hold on at least 80% of your panel members plus any intercepts known to you through prior contact? How does this differ by the sources you offer? How often is each of those data points updated? Can you supply these data points as appends to the data set? Do you collect this profiling information directly or is it supplied by a third party?
Three attributes are held for 100% of our members: gender, age, and area of residence. The remaining data is obtained through regular attribute-acquisition surveys. These attributes can also be used for survey targeting and can be appended to survey responses for delivery.
14. What information do you need about a project in order to provide an estimate of feasibility? What, if anything, do you do to give upper or lower boundaries around these estimates?
To give the most accurate feedback on feasibility, the following information is necessary: sample size, deadline, basic attributes (gender, age, region), and detailed attributes (product usage, etc.).
15. What do you do if the project proves impossible for you to complete in field? Do you inform the sample buyer as to who you would use to complete the project? In such circumstances, how do you maintain and certify third party sources/sub-contractors?
If a project cannot be completed using our Cloud Panel, we will request our client's permission to use a third-party partner. We only use partners that we have confirmed in advance can work with our system without problems and that maintain the desired quality standards.
16. Do you employ a survey router or any yield management techniques? If yes, please describe how you go about allocating participants to surveys. How are potential participants asked to participate in a study? Please specify how this is done for each of the sources you offer.
Our proprietary routing technology grants members the right to respond to surveys based on their demographics.
17. Do you set limits on the amount of time a participant can be in the router before they qualify for a survey?
Our routing technology routes each member only once, so that participants do not have a poor experience.
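A "route once" allocation rule of the kind described above can be sketched like this. The survey and participant fields are assumptions for illustration, not the actual router schema: the key property is that a participant is evaluated against open surveys a single time and is never re-queued or held waiting.

```python
# Hypothetical open surveys with simple demographic targeting and
# remaining completes needed; illustrative fields only.
surveys = [
    {"survey_id": "S1", "target": {"gender": "F"}, "needed": 1},
    {"survey_id": "S2", "target": {"gender": "M"}, "needed": 2},
]

def route_once(participant, surveys):
    """Return the first open survey whose targeting matches, or None.

    The participant is evaluated exactly once; if nothing matches,
    they exit immediately rather than waiting in the router."""
    for survey in surveys:
        if survey["needed"] <= 0:
            continue  # quota already filled
        if all(participant.get(k) == v for k, v in survey["target"].items()):
            survey["needed"] -= 1
            return survey["survey_id"]
    return None  # no match: participant leaves the router
```

Because the function either allocates or returns immediately, no participant can accumulate time in the router before qualifying.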
18. What information about a project is given to potential participants before they choose whether to take the survey or not? How does this differ by the sources you offer?
For regular projects, we do not provide any information that could cause response bias. When a project's content is sensitive, we place a warning on the questionnaire cover page and ask participants to agree in advance before taking part in the study.
19. Do you allow participants to choose a survey from a selection of available surveys? If so, what are they told about each survey that helps them to make that choice?
Generic survey names, the number of questions, and the number of points to be earned are clearly indicated. If any special notes are required, they will be clearly indicated on the cover page of the survey.
20. What ability do you have to increase (or decrease) incentives being offered to potential participants (or sub-groups of participants) during the course of a survey? If so, can this be flagged at the participant level in the dataset?
By default, rewards on a given study cannot differ between participants.
21. Do you measure participant satisfaction at the individual project level? If so, can you provide normative data for similar projects (by length, by type, by subject, by target group)?
For certain panels, we measure respondent satisfaction with each questionnaire so that participants can check a survey's score in advance. We also conduct monthly respondent satisfaction surveys, including NPS indicators.
22. Do you provide a debrief report about a project after it has completed? If yes, can you provide an example?
No special reports are provided.
Data quality and Validation
23. How often can the same individual participate in a survey?
How does this vary across your sample sources?
What is the mean and maximum amount of time a person may have already been taking surveys before they entered this survey? How do you manage this?
There is no particular limit to the number of times each member can complete the survey.
24. What data do you maintain on individual participants such as recent participation history, date(s) of entry, source/channel, etc.? Are you able to supply buyers with a project analysis of such individual level data? Are you able to append such data points to your participant records?
Registration data holds information such as the time and date of survey response and the last response date. This data is used for internal panel management and is not usually provided to customers.
25. Please describe your procedures for confirmation of participant identity at the project level. Please describe these procedures as they are implemented at the point of entry to a survey or router.
Our survey panels are deduplicated in advance, and surveys are distributed and managed by member ID.
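The advance deduplication by member ID can be sketched in a few lines. The `member_id` field name is an assumption for illustration; the point is simply that only one record per ID survives, so the same person cannot enter a survey twice.

```python
def dedupe_panel(records):
    """Keep exactly one record per member ID (first occurrence wins)."""
    seen = set()
    unique = []
    for rec in records:
        if rec["member_id"] not in seen:
            seen.add(rec["member_id"])
            unique.append(rec)
    return unique
```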
26. How do you manage source consistency and blend at the project level? With regard to trackers, how do you ensure that the nature and composition of sample sources remain the same over time? Do you have reports on blends and sources that can be provided to buyers? Can source be appended to the participant data records?
We randomly select participants from a panel that has been managed without duplication in advance, so we can collect questionnaire data with minimal bias. In the case of trackers, we propose the number of respondents to be collected by taking into account the dropout rate based on past data.
27. Please describe your participant/member quality tracking, along with any health metrics you maintain on members/participants, and how those metrics are used to invite, track, quarantine, and block people from entering the platform, router, or a survey. What processes do you have in place to compare profiled and known data to in-survey responses?
We conduct regular quality checks on the member IDs flagged as bad data by our customers, and if a member's quality-survey score does not meet our standards, we stop sending surveys to that member.
28. For work where you program, host, and deliver the survey data, what processes do you have in place to reduce or eliminate undesired in-survey behaviours, such as (a) random responding, (b) Illogical or inconsistent responding, (c) overuse of item non-response (e.g.,“ Don’t Know”) (d) inaccurate or inconsistent responding, (e) incomplete responding, or (f) too rapid survey completion?
For the surveys we host, we clean the data according to our own cleaning standards before delivery. We conduct the same quality-check surveys described in question 27 to control members who may have answered untruthfully or fraudulently. The IDs of respondents with extremely short response times are automatically output each day and periodically checked for anomalies.
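The daily flagging of extremely short response times can be sketched as a simple median-based cutoff. Both the 0.3 ratio and the data shape are assumptions chosen for the example, not GMO Research's actual cleaning standard.

```python
from statistics import median

def flag_speeders(durations, ratio=0.3):
    """Flag respondent IDs whose completion time is below a fraction
    of the median duration for the survey.

    durations: dict mapping respondent ID -> completion time in seconds.
    ratio: illustrative threshold; a real cleaning rule would be tuned
    per survey length and validated against other quality signals.
    """
    cutoff = median(durations.values()) * ratio
    return sorted(rid for rid, secs in durations.items() if secs < cutoff)
```

A relative cutoff like this adapts to questionnaire length, whereas a fixed number of seconds would over- or under-flag depending on the survey.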
Policies And Compliance
Our privacy notice clearly states which company handles personal information and how it is handled. We also disclose our contact information so that participants can contact us at any time.
30. How do you comply with key data protection laws and regulations that apply in the various jurisdictions in which you operate? How do you address requirements regarding consent or other legal bases for the processing personal data?
How do you address requirements for data breach response, cross-border transfer, and data retention? Have you appointed a data protection officer?
We have obtained and are in compliance with the P-Mark, a Japanese certification for personal information protection, and the ISMS, a global certification for data security. We have also appointed a data protection officer to ensure that we comply with GDPR, CCPA, and other area-specific laws and regulations.
31. How can participants provide, manage and revise consent for the processing of their personal data? What support channels do you provide for participants?
In your response, please address the sample sources you wholly own, as well as those owned by other parties to whom you provide access.
When personal data is handled in a questionnaire, we disclose the identity of the company handling the data and clearly state its contact information, and we ask only those members who have agreed to respond to the questionnaire. For inquiries regarding personal information, we provide an inquiry form for our own member organizations; for other panel sources, we exchange contact information with our management partners to ensure a smooth response.
32. How do you track and comply with other applicable laws and regulations, such as those that might impact the incentives paid to participants?
We comply with the laws and regulations of each country by checking with our partners in each country.
33. What is your approach to collecting and processing the personal data of children and young people? Do you adhere to standards and guidelines provided by ESOMAR or GRBN member associations? How do you comply with applicable data protection laws and regulations?
We do not send out surveys directly to members of the age group designated as children in each country. If we are commissioned to send out a survey to children, we make sure to contact the parents of the children so that they can answer the survey together with their children.
34. Do you implement “data protection by design” (sometimes referred to as “privacy by design”) in your systems and processes? If so, please describe how.
We have systems in place to ensure that only a limited number of members have direct access to personal information. We do not handle any personal information in normal operation.
35. What are the key elements of your information security compliance program? Please specify the framework(s) or auditing procedure(s) you comply with or certify to. Does your program include an asset-based risk assessment and internal audit process?
We have already obtained the Privacy Mark, a certification for personal information protection in Japan. However, as we accelerate our global expansion, we decided that we needed a more global certification and obtained ISMS certification in 2020. We currently operate in compliance with both frameworks, along with our internal audit process.
36. Do you certify to or comply with a quality framework such as ISO 20252?
Same as above.
37. Which of the following are you able to provide to buyers, in aggregate and by country and source? Please include a link or attach a file of a sample report for each of the metrics you use.
01. Average qualifying or completion rate, trended by month
02. Percent of paid completes rejected per month/project, trended by month
03. Percent of members/accounts removed/quarantined, trended by month
04. Percent of paid completes from 0-3 months tenure, trended by month
This data is not tracked.
05. Percent of paid completes from smartphones, trended by month
06. Percent of paid completes from owned/branded member relationships versus intercept participants, trended by month
07. Average number of dispositions (survey attempts, screenouts, and completes) per member, trended by month (potentially by cohort)
08. Average number of paid completes per member, trended by month (potentially by cohort)
09. Active unique participants in the last 30 days
Unique participants (Japan, Feb 2023) : 1,893,358
10. Active unique 18-24 male participants in the last 30 days
Unique 18-24 male participants (Japan, Feb 2023): 61,149
11. Maximum feasibility in a specific country with nat rep quotas, seven days in field, 100% incidence, 10-minute interview
Maximum Feasibility (Japan, Dec 2022): 544,887
12. Percent of quotas that reached full quota at time of delivery, trended by month