NewtonX Knowledge Graph launches, using AI to identify 1.1 billion niche experts
Dataconomy – Thu, 18 Mar 2021

Companies often struggle to find and engage high-quality, high-caliber professionals to support market research and their broader business. AI and machine learning may hold the key to solving this problem.

Today, NewtonX, a B2B research company, has launched its NewtonX Knowledge Graph, aiming to solve this exact problem. 

NewtonX claims it is the only company in the market that can guarantee its professionals have passed identity, background and subject-matter expertise checks. The experts found within its solution help businesses make critical decisions about products, services, pricing and marketing innovation, understand the voice of the customer, and benchmark against competitors. 

“When my co-founder Sascha Eder and I were working together at McKinsey, we were frustrated by the quality, speed, and cost involved in the process of sourcing B2B insights,” Germain Chastel, co-founder and CEO at NewtonX, told me. “We had a very large and formative client in the tech space that wanted subject-matter experts to weigh in on areas such as virtual reality, AI, and big data. However, we weren’t getting the results we wanted.”

That’s where artificial intelligence and machine learning came into play.

“We needed reliable data from highly specific populations, but we had no way of accessing this data at scale,” Chastel said. “Most B2B market research relies primarily on consumer-oriented panels where fraud can be rampant. Vetting for experience and verifying that people are who they say they are become nearly impossible to do at scale. It was an opportunity for AI and automation.”

How does it work, and how did NewtonX manage to create such an extensive network of experts?

“Rather than pull from a closed pool of respondents in a traditional panel, we built the NewtonX Graph, our proprietary technology that sources from an open network of 1.1 billion B2B professionals across over 140 industries,” Chastel said. “The Graph uses a series of API integrations that interface with private databases from NewtonX partners (recruiting firms, professional associations, trade associations, conference organizers, etc.) as well as data providers and search engines such as Dow Jones Factiva, Bing, Google, LinkedIn, and Xing, among others to scan for professionals that match customer criteria. This is how we deliver the largest possible reach without compromising any nuance.”
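To make the sourcing model concrete, here is a minimal, purely illustrative sketch of the kind of aggregate-and-filter step such a graph might perform. All names, fields and data here are hypothetical assumptions for illustration; NewtonX has not published its implementation, and the real system operates over API integrations with partner databases and search engines rather than in-memory lists.

```python
from dataclasses import dataclass


@dataclass
class Profile:
    name: str
    title: str
    industry: str
    sources: tuple  # which integrations surfaced this person (provenance)


def merge_candidates(*source_results):
    """Deduplicate profiles surfaced by multiple integrations, keeping provenance."""
    merged = {}
    for source, profiles in source_results:
        for p in profiles:
            key = (p.name, p.title)
            if key in merged:
                merged[key].sources += (source,)
            else:
                merged[key] = Profile(p.name, p.title, p.industry, (source,))
    return list(merged.values())


def match(profiles, industry, title_keyword):
    """Filter the open network down to a customer's target criteria."""
    return [p for p in profiles
            if p.industry == industry and title_keyword.lower() in p.title.lower()]


alice = Profile("Alice", "VP of Payments", "fintech", ())
bob = Profile("Bob", "CEO", "fintech", ())
merged = merge_candidates(("linkedin", [alice, bob]), ("xing", [alice]))
print([p.name for p in match(merged, "fintech", "vp")])  # Alice matches the VP criterion
```

The point of the sketch is the design idea Chastel describes: sourcing from an open network of many providers (and recording where each candidate came from) rather than drawing on a single closed panel.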

The use cases for NewtonX tend to be in quantitative surveys and qualitative consultations, which are sometimes combined to provide its clients with the information they need to make crucial business decisions.

“We see a lot of interest from large enterprise organizations in our NewtonX Q3 Formula, which is a mix of qualitative and quantitative methodologies applied to the same population,” Chastel said. “It’s simple but effective: we start with 1:1 interviews and use those qualitative in-depth insights to inform a survey to refine the initial findings at scale and quantify them. Then we take the survey results and follow up with individual respondents who provided interesting insights to discuss further. Since every professional that works with NewtonX goes through a 2-step ID check, we can reconnect with them to drive deeper learnings.”

Pricing for NewtonX doesn’t follow the traditional market research model. Instead, the startup is taking an alternate approach.

“Unlike our competitors, there is no subscription required to work with us,” Chastel said. “We have two pricing models: a pay-per-project fee and a retainer fee. Clients who work with us mostly choose a retainer structure since it extends a discount. Pricing varies for each project depending on the scope but is primarily based on five key factors: the audience target description (e.g., CEOs vs. VPs), the number of respondents needed, the market and geography, the length of the interview and survey, and quotas or screener questions.”

NewtonX Knowledge Graph is the latest launch from the New York-based startup, which has raised $15 million in funding over 3 rounds.

“In addition to introducing the NewtonX Knowledge Graph to the market, we just launched a new brand identity that better expresses who we are and what we do,” Chastel said. “It emphasizes the human-centered approach we take to cutting-edge technology to help our clients find knowledgeable, accurate answers to their most pressing questions. Internally, the next generation of our platform rolls out this spring, which will reduce fielding time and allow clients to self-serve more easily.”

Data Mining for Social Intelligence – Opinion data as a monetizable resource
Dataconomy – Fri, 12 May 2017

The digital age is characterised increasingly by the collective. The information generated by tapping into the minds of many is driving decisions in both the public and private sector; research is becoming social.

On the back of this, a new science has emerged – known as opinion mining – which uses the latest advances in artificial intelligence (AI) to mine public opinion for sentiment. This structured data is known as opinion data. By analysing online public opinion, governments and global organisations can now access a set of insights that can shape strategy and better measure the public’s experience of their policies, service and brands.

Crucially, the field of opinion mining looks not only at sentiment, but the topics driving that sentiment. This has also opened up new research capabilities in the world of market and political research. The sheer quantity of online conversations (Facebook alone had 1.15 billion mobile daily active users on average for December 2016), coupled with the instantaneous reactivity of this digital chatter, has meant that sentiment-driven opinion data has become a mineable, monetisable resource.

Approaches to opinion mining

There are two main approaches to opinion mining. The first relies exclusively on AI to structure the data. The second adds an extra layer of analysis by routing the data through a crowd – a team of people who verify the sentiment and the topics driving it. The crowd’s role is to classify the unstructured data correctly, since pure AI-based approaches struggle with the nuances of human conversation. This problem is especially common on social media, where conversations are typically filled with humour, slang, innuendo, sarcasm, colloquialisms and emoticons.

For example, a tweet from someone saying “I just spent 5 hours in the queue at bank X… best customer service ever!” is clearly sarcastic, but to a machine will likely be seen as positive.
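A toy lexicon-based scorer makes the failure mode easy to see. This is a deliberately naive sketch (the word lists and scoring rule are illustrative assumptions, not any production sentiment model): a bag-of-words approach has no notion of context, so the sarcastic “best” drags the tweet into the positive bucket.

```python
# Minimal bag-of-words sentiment scorer: counts positive vs negative words.
POSITIVE = {"best", "great", "love", "excellent"}
NEGATIVE = {"worst", "terrible", "hate", "awful"}


def lexicon_sentiment(text):
    words = {w.strip(".,!?").lower() for w in text.split()}
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"


tweet = "I just spent 5 hours in the queue at bank X... best customer service ever!"
print(lexicon_sentiment(tweet))  # prints "positive" – the sarcasm is invisible to the model
```

A crowd reviewer, by contrast, immediately reads the five-hour queue as the real signal and reclassifies the tweet as negative – which is precisely the corrective layer described above.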

By combining AI and human understanding you get the best of both worlds – the ability to gather and process huge data sets while still gaining an accurate understanding of the public’s feelings.

The changing of the guard

Historically, such data was used primarily to guide commercial decision-making – largely by providing companies with deeper insight into consumer experience.

A much wider ambit is possible, however. While part of the field remains firmly within the commercial realm, an increasing proportion of applications lies in the governmental, state and city arena, where social analytics are proving useful both in deepening the understanding of electorates and in driving strategic political decisions.

New markets and growth prospects

Agility, reactivity and leanness – these are among the buzzwords for businesses operating within the current landscape of changing customer relationships, increasing competition and the threat of disruptive innovation. Moreover, with growth having stalled in much of the developed world, many multinationals are looking for data that can help them move into new untested markets.

With the global adoption of mobile phones and the concurrent explosion in the use of social media, companies now have access to millions of relevant data points in both mature and developing markets. The data, if mined correctly, can provide insights that not only assist in decision making, but can shape product innovation and help reduce risk when moving to new markets.

Research and consulting houses: differentiating the value proposition

The tough economic climate combined with a whole new set of disruptive technologies has created both risk and opportunity for consultancies and research houses. Clients are coming to them for help in understanding today’s consumer behaviour to drive innovation, but many of their traditional offerings and research methodologies are not fit for purpose. They are too slow, rely on small sample sizes, are expensive and come with biases that can lead to big errors. The recent failures of pollsters to predict Brexit and Trump’s victory are pertinent examples.

Given this backdrop, attracting and retaining clients by offering insights based on opinion data coupled with their specialist expertise may be a crucial differentiator for the sector going forward.

For example, a major application of opinion mining is its use in understanding the drivers underlying the purchasing behaviour of the market. Opinion data can allow researchers to predict what people are going to buy and understand the factors driving that behaviour.

A tweet in which a consumer expresses dissatisfaction with a new model of car or phone, for example, may not shed that much light in isolation. However, if many people do the same – and the drivers behind their dissatisfaction can be analysed, pinpointed and quantified – the resulting data can be used to improve subsequent product lines, or understand why competitors’ offerings may be outperforming.
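The aggregation step described here – many individually uninformative complaints becoming an actionable ranking of dissatisfaction drivers – can be sketched in a few lines. The driver tags and counts below are invented for illustration; in practice the tags would come from the AI-plus-crowd classification step described earlier.

```python
from collections import Counter

# Hypothetical driver tags attached to negative mentions of a product,
# as produced by an upstream sentiment/topic classification step.
tagged_mentions = [
    ("battery life", "negative"),
    ("battery life", "negative"),
    ("screen", "negative"),
    ("battery life", "negative"),
    ("price", "negative"),
]

# Counting drivers across many mentions turns isolated complaints
# into a ranked, quantified picture of what is hurting the product.
driver_counts = Counter(driver for driver, sentiment in tagged_mentions
                        if sentiment == "negative")
for driver, n in driver_counts.most_common():
    print(f"{driver}: {n}")
```

Ranked this way, a product team can see at a glance that battery life, not price, is the dominant driver of dissatisfaction – the kind of insight a single tweet could never provide.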

Media: personalised and on-demand

The future of the media landscape will be one characterised by an increasing focus on personalised content and on-demand viewing.

From an opinion mining perspective, helpfully, many viewers are both vocal and honest about their tastes, often turning to social media to express their feelings: be it about a pilot programme, a new channel or a major happening in a much-loved series.

By delving into this information, and the drivers behind it, media companies can gain access to the type of granular data needed to tailor their programming strategies – both by channel and by market – in order to deliver viewers the content they want in the form they want to consume it.

Mainstream news media are also struggling to understand their audiences and how they feel about today’s stories. Fake news and “news feed bubbles” are creating significant pockets of public opinion that are not grounded in truth, undermining the legitimacy of (and demand for) their journalism. Opinion mining offers a way for media to accurately measure and report on public opinion, regardless of its veracity.

Insight into the electorate

Governments and other state bodies are another group of organisations likely to turn to social media analysis in the search for greater clarity. 2017 will see a series of major political events: March witnessed the triggering of Article 50, April the announcement of a snap UK election, while May ushered in a new era for French politics with the election of Emmanuel Macron as President. Later in the year, Germany will vote, the Chinese Communist Party’s five-yearly congress is scheduled, and further interest rate hikes by the Fed are also due.

As states begin to push for a greater degree of foresight around the potential outcome of such key events, the insights hidden within the voice of the collective will likely be sought out increasingly through opinion mining.

Future of opinion mining

A wealth of unstructured opinion data – across social media and elsewhere – has until now been a largely inaccessible resource. However, by tapping into the text-analysis powers of artificial intelligence, and by overlaying it with the refinement of crowd-based analysis, the world of opinion mining is set to change the way governments and businesses conduct research, make decisions and measure the impact of those decisions.

 


Image: Alberto G, CC BY 2.0
