Impact of Grok AI on Business

Elon Musk’s xAI is launching its own LLM, Grok, which will have the currently unique feature of access to the live Twitter / X data stream. This means it will be able to answer questions about near-real-time events, thoughts, controversies and ideas – and, tediously, celebrity news too, no doubt. So what will be the impact of Grok AI on business?

It’s far too early to know, but as a first attempt I asked ChatGPT the following question.

“Suppose a new LLM had access to up-to-the-second information via the Twitter platform. What would be the implications for business, and how might one utilise the tool?”

<STARTS>

If a new Large Language Model (LLM) had access to up-to-the-second information via Twitter / X, it could have significant implications for businesses in various ways, particularly in the areas of market intelligence, customer service, and real-time decision-making. Here’s how businesses might utilize such a tool:

Market Intelligence and Trends Analysis:

  • Real-time Monitoring: Companies could monitor real-time public opinion and reaction to events, products, or services, allowing them to adjust marketing strategies promptly.
  • Trend Spotting: The LLM could identify emerging trends by analyzing the velocity and sentiment of tweets on particular topics, providing businesses with early insights into consumer interests and market shifts.

Customer Service and Engagement:

  • Instant Response: By integrating the LLM with their customer service platforms, businesses could provide instant responses to customer queries on Twitter, improving customer experience.
  • Sentiment Analysis: The LLM could assess customer sentiment in real-time, allowing businesses to address complaints or issues before they escalate.
  • Personalization: By analyzing tweets, businesses could tailor their communication to individual customers, enhancing engagement and loyalty.

Crisis Management:

  • Issue Detection: The LLM could quickly identify negative trends or potential crises related to the company, enabling faster response and mitigation efforts.
  • Communication Assistance: It could help draft real-time responses and communication strategies to manage public perception during a crisis.

Competitive Analysis:

  • Competitor Monitoring: Businesses could use the LLM to keep track of competitors’ activities, announcements, and customer feedback, providing insights into their strengths and weaknesses.
  • Benchmarking: It could help in analyzing how competitors’ events or product launches are being received by the market.

Content Creation and Curation:

  • Content Ideas: The tool could generate content ideas that are currently trending or likely to resonate with the audience.
  • Automated Posting: Businesses could use it to create and post Twitter content in real-time, reacting to ongoing discussions or events.

Financial Markets:

  • Market Sentiment: In financial services, the LLM could gauge market sentiment on financial products, stocks, or cryptocurrencies, potentially influencing trading decisions.
  • Event-Driven Trading: Traders might use insights from Twitter to inform event-driven trading strategies.

Product Development:

  • Feedback Analysis: Real-time analysis of feedback on products could inform product development and improvement.
  • Idea Generation: The tool could identify gaps in the market or consumer pain points that could lead to new product ideas.

Risk Management:

  • Reputation Monitoring: Constant monitoring of the brand’s perception on Twitter could help in assessing and managing reputational risks.
  • Compliance and Governance: The LLM could detect discussions related to regulatory compliance, potentially identifying risks or required actions.

Strategic Planning:

  • Data-Driven Decisions: Insights gleaned from Twitter data could be incorporated into strategic business decisions, making them more responsive to current events and trends.

Advertising and Promotions:

  • Targeted Campaigns: Analysis of Twitter could help in creating targeted ad campaigns that align with current discussions and user interests.
  • Influencer Identification: It could identify key influencers and opinion leaders in real time, providing opportunities for partnerships or endorsements.

Challenges and Considerations:

  • Data Privacy: Businesses must navigate privacy concerns and adhere to regulations regarding the use of public data from social platforms.
  • Information Overload: The sheer volume of data might require sophisticated filtering to avoid analysis paralysis.
  • Misinformation: The tool would need to be capable of distinguishing between credible information and rumors or fake news.
  • Contextual Understanding: Tweets are often nuanced and filled with slang, humor, and sarcasm, requiring advanced understanding to interpret correctly.

Overall, a real-time LLM with Twitter access would be a powerful tool for businesses, provided it is used responsibly and in compliance with data protection laws and ethical guidelines. The key would be integrating the insights into the decision-making process in a meaningful way that enhances, rather than overwhelms, business intelligence functions.

<ENDS>

Impact of Grok AI on Business

The impact of Grok AI on business is likely to be profound, both directly and indirectly. Access to real-time data will become increasingly important as the competitive environment intensifies, and nimble businesses that respond to events fastest could gain a competitive advantage. Consider a situation where an influencer tweets about your product or service: you could automatically respond with a tweet offering a special one-day-only offer to capitalise on the publicity. A social media manager could do the same, of course, but most businesses don’t have the resources to employ one full-time.

Indirectly, xAI giving Grok AI access to the Twitter / X feed will put considerable pressure on other LLM suppliers – OpenAI, Google, Microsoft and others – to make up-to-date information available to their LLMs too. However, at the moment those suppliers don’t have access to anything like the up-to-date information fire hose that Twitter / X has, and it seems unlikely that Elon Musk and xAI will be in a hurry to share it. Why would they?

Personal LLMs

Personal LLMs: have a discussion with yourself – but a yourself with a much better memory

Personal LLMs (Large Language Models) will be ubiquitous within two years, and it seems likely that companies like Apple and Google will develop mobile LLM apps in parallel with focusing their hardware and software development on AI. A personal LLM will let you converse with your own private data – your Notion or Roam database, your Google Docs, your journal, internal company databases, your photo albums and the data generated by health apps and other systems. You will be able to talk to your documents and interrogate them using Natural Language Understanding (NLU).

Apart from the obvious privacy benefits – your data will remain on your device – how might this be useful? What are the implications for how we deal with information, both in the work environment and at home? Will it also lead to the “death of search”, the gradual migration away from search engines to LLMs by people looking for information? The “what” may still be served better by search engines, but the “how” is often already performed better by LLMs, and the gap will widen as LLMs gain access to the web. Questions that have complicated answers are already answered in a much better style by LLMs, although the accuracy of the information contained in those answers can be poor.

Let’s look at these questions.

How local LLMs will be useful

You will be able to talk to your device and ask questions of all your personal data, as if having a dialogue with yourself – but a yourself with a better memory.

“When did I go to Rome and what was the name of the hotel I stayed in?”

“Show me any photos I took on that trip that include pictures of my wife.”

“When did I last go to the dentist?”

“How many times have I been to Oxford this year?”

“What was that article about lowering sea levels by flooding parts of the desert called?”

“Have I ever written notes about Gustav Holst?” etc.
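Underneath queries like these sits a retrieval step: find the personal documents relevant to the question, then let the LLM phrase the answer. A real personal LLM would use on-device embeddings and a local model; the toy sketch below uses plain keyword overlap over invented notes, purely to illustrate the retrieval idea:

```python
# Illustrative personal notes (invented for this example).
notes = {
    "rome-2019.md": "Trip to Rome in May 2019, stayed at the Hotel Artemide.",
    "dentist.md": "Dental check-up on 3 March, next appointment in September.",
    "holst.md": "Notes on Gustav Holst and the composition of The Planets.",
}

def search_notes(query: str, top_k: int = 1) -> list[str]:
    """Rank notes by how many query words each note contains (crude stand-in
    for embedding similarity in a real system)."""
    q_words = set(query.lower().split())
    scored = sorted(
        notes,
        key=lambda name: len(q_words & set(notes[name].lower().split())),
        reverse=True,
    )
    return scored[:top_k]

print(search_notes("Have I ever written notes about Gustav Holst?"))
# A full system would pass the retrieved note to the LLM to compose the reply.
```

The design point is the two-stage split: cheap local retrieval narrows the corpus, and only then does the language model generate an answer grounded in what was found.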

The fluidity of streamlined knowledge retrieval is an important benefit in its own right – it will speed up thought, stop the retrieval process from interrupting ideation and reduce the time spent on information administration and housekeeping. But with all gains there will be losses. If we stop needing to retrieve information from the deep recesses of our memories, will we lose the ability as neural pathways atrophy? The evidence suggests we might: after all, most people who could once do mental arithmetic well are now worse at it because they use calculators instead. Why shouldn’t the process of information retrieval go the same way?

Personalised, local LLMs will act as personal assistants, or PAs. Coupled with specific plugins, you will be able to get them to book tickets or flights, write articles while you sleep based on overnight headlines and Twitter feeds, suggest recipes and order the ingredients missing from your fridge.

The death of search

Search engines like Google and Bing are good for atomised searching – looking for something specific and well-defined. They have always been poor at providing answers to more nuanced questions such as “why does temperature fall as altitude increases?” Instead, they list sources of answers that you often have to view one after another to get the answer you need. With an LLM there is a single step: you ask the question and you get the answer. So people are already migrating away from search towards more helpful tools like Claude, Perplexity and ChatGPT for anything but the most basic of searches.

This will, of course, cut into the advertising income generated by search ads. Google in particular, which has a degree of financial diarrhoea due to heavy ingestion of ad cash, may start to suffer. As someone pointed out a while back, if Bing search dies, Microsoft still thrives; if Google search dies, so does Google. So far, no search engine has successfully worked out how to splatter LLM outputs with ads and perhaps, with a bit of luck, no one will.