The Promise and Pitfalls of Chaining Large Language Models for Email

Tom Tunguz

Over the last few weeks I’ve been experimenting with chaining together large language models. I dictate emails & blog posts often. Bad data from the transcription -> inaccurate prompt to the LLM -> incorrect output. In machine learning systems, achieving an 80% solution is pretty rapid.
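
To make the failure mode concrete, here is a minimal sketch of what such a dictation chain might look like, assuming the OpenAI Python SDK for both steps; the model names, prompt, and file name are illustrative placeholders, not the exact setup from the post. Any error in the transcription step flows straight into the drafting step.

    # Minimal two-step chain: dictated audio -> transcript -> drafted email.
    # Assumes the OpenAI Python SDK (>= 1.0); model names, prompt, and file name
    # are placeholders. Mistakes in step 1 become bad input for step 2.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    def transcribe(audio_path: str) -> str:
        # Step 1: speech-to-text. A mis-heard word here skews everything downstream.
        with open(audio_path, "rb") as audio:
            result = client.audio.transcriptions.create(model="whisper-1", file=audio)
        return result.text

    def draft_email(transcript: str) -> str:
        # Step 2: ask an LLM to turn the raw transcript into a polished email.
        response = client.chat.completions.create(
            model="gpt-4o",
            messages=[
                {"role": "system", "content": "Rewrite the dictation below as a clear, concise email."},
                {"role": "user", "content": transcript},
            ],
        )
        return response.choices[0].message.content

    if __name__ == "__main__":
        print(draft_email(transcribe("dictation.m4a")))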

Which Increases Productivity More: The Advent of the Personal Computer or the Large-Language Model?

Tom Tunguz

That’s the conclusion from OpenAI’s recent paper “GPTs are GPTs: An Early Look at the Labor Market Impact Potential of Large Language Models.” How much might US GDP grow assuming large-language models enable US workers to do more? The BEA estimates US GDP is $26.2t.
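
As rough back-of-the-envelope arithmetic, the upside scales linearly with whatever productivity lift one assumes on top of that base. The sketch below uses the $26.2t figure from the excerpt; the lift percentages are hypothetical scenarios, not numbers from the OpenAI paper.

    # Incremental output = baseline GDP x assumed economy-wide productivity lift.
    # $26.2t is the BEA estimate cited above; the lifts are hypothetical scenarios.
    US_GDP = 26.2e12  # dollars

    for lift in (0.01, 0.02, 0.05):
        print(f"{lift:.0%} lift -> ~${US_GDP * lift / 1e12:.2f}t of additional annual output")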

How to Improve Your LLM: Combine Evaluations with Analytics

Tom Tunguz

The future of LLM evaluations resembles software testing more than benchmarks. Real-world testing looks like this: asking LLMs to produce Dad jokes, like this zinger: I’m reading a book about gravity & it’s impossible to put down. LLMs are tricky. 1 can be greater than 4. This is called non-determinism.
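
A minimal sketch of what a software-test-style eval could look like: sample the same prompt several times and grade each output, then assert on the pass rate rather than a single run. The generate_joke stub and the toy grader below are illustrative stand-ins, not a real benchmark or production harness.

    # Unit-test-style LLM eval: grade a batch of samples, not a single output,
    # because identical prompts can return different answers (non-determinism).
    import random

    def generate_joke(topic: str) -> str:
        # Stand-in for the application's real model call; swap in your LLM client.
        canned = [
            f"I'm reading a book about {topic} & it's impossible to put down.",
            f"I wrote a song about {topic}, but it never took off.",
        ]
        return random.choice(canned)

    def looks_like_a_dad_joke(joke: str) -> bool:
        # Toy grader: non-empty, tweet-length, at most a couple of sentences.
        return 0 < len(joke) <= 280 and joke.count(".") <= 2

    def test_dad_joke_eval(samples: int = 5) -> None:
        outputs = [generate_joke("gravity") for _ in range(samples)]
        pass_rate = sum(looks_like_a_dad_joke(o) for o in outputs) / samples
        assert pass_rate >= 0.8, f"only {pass_rate:.0%} of samples passed"

    if __name__ == "__main__":
        test_dad_joke_eval()
        print("eval passed")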

Context.ai - Unlocking Insight into LLM-Based Applications

Tom Tunguz

Large-language models have transformed how millions interact with products: from customer support to code generation to legal document analysis. These new engagement models invite users through a meaningfully different product journey. Context.ai is a product analytics platform for LLM-powered applications.

How to Leverage AI for Actionable Insights in BI, Data, and Analytics

In the rapidly evolving world of embedded analytics and business intelligence, one important question has emerged at the forefront: How can you leverage artificial intelligence (AI) to enhance your application’s analytics capabilities?

What are large language models — and how are they used in generative AI?

IT World

When ChatGPT arrived in November 2022, it made mainstream the idea that generative artificial intelligence (AI) could be used by companies and consumers to automate tasks, help with creative ideas, and even code software. Want some ideas for a new marketing or ad campaign? Generative AI to the rescue.

Machine Learning vs. Artificial Intelligence: What’s The Difference? 

SaaS Metrics

Have you ever mixed up artificial intelligence and machine learning? You’re not alone. Artificial intelligence, often called AI, is the big idea. Machine learning, or ML for short, is just one tool inside it.

Embedding BI: Architectural Considerations and Technical Requirements

While data platforms, artificial intelligence (AI), machine learning (ML), and programming platforms have evolved to leverage big data and streaming data, the front-end user experience has not kept up. Traditional Business Intelligence (BI) tools aren’t built for modern data platforms and don’t work on modern architectures.

LLMOps for Your Data: Best Practices to Ensure Safety, Quality, and Cost

Speaker: Shreya Rajpal, Co-Founder and CEO at Guardrails AI & Travis Addair, Co-Founder and CTO at Predibase

Large Language Models (LLMs) such as ChatGPT offer unprecedented potential for complex enterprise applications. However, productionizing LLMs comes with a unique set of challenges such as model brittleness, total cost of ownership, data governance and privacy, and the need for consistent, accurate outputs.

LLMs in Production: Tooling, Process, and Team Structure

Speaker: Dr. Greg Loughnane and Chris Alexiuk

Join Dr. Greg Loughnane and Chris Alexiuk in this exciting webinar to learn all about: how to design and implement production-ready systems with guardrails; active monitoring of key evaluation metrics beyond latency and token count; managing prompts and understanding the process for continuous improvement; and best practices for setting up the proper mix of open- (..)

A Tale of Two Case Studies: Using LLMs in Production

Speaker: Tony Karrer, Ryan Barker, Grant Wiles, Zach Asman, & Mark Pace

Join our exclusive webinar with top industry visionaries, where we'll explore the latest innovations in Artificial Intelligence and the incredible potential of LLMs. We'll walk through two compelling case studies that showcase how AI is reimagining industries and revolutionizing the way we interact with technology.

How Banks Are Winning with AI and Automated Machine Learning

By leveraging the power of automated machine learning, banks have the potential to make data-driven decisions for products, services, and operations. Read the whitepaper, How Banks Are Winning with AI and Automated Machine Learning, to find out more about how banks are tackling their biggest data science challenges.

Generative AI Deep Dive: Advancing from Proof of Concept to Production

Speaker: Maher Hanafi, VP of Engineering at Betterworks & Tony Karrer, CTO at Aggregage

Executive leaders and board members are pushing their teams to adopt Generative AI to gain a competitive edge, save money, and otherwise take advantage of the promise of this new era of artificial intelligence.

Resilient Machine Learning with MLOps

Today’s economy is under pressure from inflation, rising interest rates, and disruptions in the global supply chain. As a result, many organizations are seeking new ways to overcome challenges — to be agile and rapidly respond to constant change. We do not know what the future holds.