Summarizing Long Emails and Other Text

Another great use of Big ML models like GPT-3 is to summarize text by extracting the main points being made. Virtually all types of text can be summarized, including news articles, blog posts, research papers, novels, and poems. The summary can then be used to provide an overview of the text, or to provide a concise summary for readers who do not have time to read it all.

If you’re just tuning in, check out the Table of Contents and the Introduction to using OPT-175B.

Creating summaries

There are a couple of techniques we can use to generate summaries, depending on the length and format of the text. One of the first problems you’ll probably run into is the length limit on the input. For example, if we try to summarize an article that is too long with OPT-175B, we get an error message like this one: Unfeasible length constraints: the minimum length (771) is larger than the maximum length (512).

To work around length limitations, we can split the text into multiple pieces and summarize each of those pieces individually first. To improve quality, that output can then be summarized or rewritten again until we have a satisfactory result.

As a demonstration, I’ll be creating a summary of a previous article posted here, ‘Cyber Security Questionnaires: “This Makes No Sense”’. Structurally speaking, the article has an introduction and three sections, so my first inclination is to break it into pieces like this:

  1. Introduction + Section 1
  2. Section 2
  3. Section 3

After doing so and running each piece through OPT-175B in turn, I get the following outputs:

1. How often do you update your software? – Many software vendors are constantly updating their software for security fixes. Keeping up with these updates is important. If you do not keep up with these updates, then cyber criminals have more time to break into your systems and steal valuable information.

2. What is the Cybersecurity Self-Assessment Questionnaire? The Cybersecurity Self-Assessment Questionnaires (CSAT) is a voluntary, self-assessment tool designed to help small businesses evaluate their current security posture and identify areas where they may be vulnerable to an attack.

3. Insurers are required to complete a new questionnaire that asks about the insurer’s cybersecurity program.

So, one more time, I feed these results back in as a single prompt, ask OPT-175B to generate a summary, and get back the final result:

1. How often do software vendors update their software for security fixes?

2. What is the Cyber Security Self-Assessment Questionnaire?

3. Insurers are required a new questionnaire that asks about the insurers’ cybersecurity program.
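
If you want to script this split-then-resummarize workflow rather than pasting pieces into a web form by hand, the same idea is easy to express in code. Here is a minimal sketch using the OpenAI Completion API (introduced later in this article); the section variables, prompt wording, and settings are illustrative rather than the exact ones used above.

    # Minimal sketch of the split-then-resummarize workflow, using the
    # legacy openai Completion API (pre-1.0 Python client). Variable names
    # and prompt wording are illustrative.
    import openai

    openai.api_key = "YOUR_API_KEY"

    def summarize(text, max_tokens=256):
        """Ask the model to summarize a single chunk of text."""
        response = openai.Completion.create(
            model="text-davinci-002",
            prompt=text + "\n\nHere is a summary of the text:",
            max_tokens=max_tokens,
            temperature=0.7,
        )
        return response.choices[0].text.strip()

    # 1. Split the article into pieces small enough to fit the length limit.
    sections = [intro_and_section_1, section_2, section_3]  # placeholder variables

    # 2. Summarize each piece individually.
    partial_summaries = [summarize(s) for s in sections]

    # 3. Feed the partial summaries back in as a single prompt for a final pass.
    print(summarize("\n\n".join(partial_summaries)))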

Prompt variations for different types of summaries

As you are experimenting with various prompts to summarize your text, try out some of these:

  • Extract a list of keywords from the text
  • Summarize the text for a second grader
  • Explain the text to a 5 year old
  • Give me an Executive Summary of the text

Each of these prompts will produce different results from the model and is worth keeping in mind when you are summarizing texts. Personally, I’ve found extracting a list of keywords to be one of the most helpful prompts for understanding the meaning of a text.
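
Trying these variations side by side is easy to script as well. The sketch below loops over the prompts above against the same source text, again using the legacy openai Completion API; the file name and settings are placeholders.

    # Sketch: run several summary-style prompts against the same text and
    # compare the results. Assumes the legacy openai Completion API.
    import openai

    openai.api_key = "YOUR_API_KEY"
    article_text = open("article.txt").read()  # placeholder source text

    instructions = [
        "Extract a list of keywords from the text:",
        "Summarize the text for a second grader:",
        "Explain the text to a 5 year old:",
        "Give me an Executive Summary of the text:",
    ]

    for instruction in instructions:
        response = openai.Completion.create(
            model="text-davinci-002",
            prompt=f"{article_text}\n\n{instruction}",
            max_tokens=256,
            temperature=0.7,
        )
        print(instruction)
        print(response.choices[0].text.strip())
        print()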

Comparing GPT-3 to OPT-175B – what is the difference?

Thus far we’ve only touched on GPT-3 and have instead been using OPT-175B through this free web service for our demos. Being free, it comes with a couple of limitations, such as a short maximum response length and the need to solve a CAPTCHA to submit each request. The biggest benefit of OPT-175B, however, is that you can host it yourself if you have capable hardware. There are several versions, ranging from a small one that can run on a normal computer up to the largest model, which requires a cluster of machines with high-end graphics cards.
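
If you want to try self-hosting, the smaller OPT checkpoints published on the Hugging Face Hub are the easiest starting point. Here is a rough sketch using the transformers library; the 1.3B model shown fits on an ordinary machine, while the full 175B model needs a multi-GPU cluster and a separate access request. Prompt text and generation settings are illustrative.

    # Sketch: generating text locally with one of the smaller OPT checkpoints
    # via Hugging Face transformers. Prompt and settings are illustrative.
    from transformers import pipeline

    generator = pipeline("text-generation", model="facebook/opt-1.3b")

    prompt = "Some long email or article text...\n\nHere is a summary of the text:"
    result = generator(prompt, max_new_tokens=100, do_sample=True, temperature=0.7)
    print(result[0]["generated_text"])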

GPT-3 cannot be self-hosted in this way: Microsoft has purchased an exclusive license from OpenAI to run the model internally, and everyone else can only reach it through OpenAI’s paid API. But because the API is pay-per-use, you can use it as much as you want without artificially low limits.

How does GPT-3 do at summarizing text?

For the current example, GPT-3 does an excellent job of summarizing the text. Much better, I think, than OPT-175B. In my case I gave it the full article along with the prompt ‘Here is a summary of the text:’ and was surprised by this output:

The insurance carrier has sent out a cyber security questionnaire to fill out and the person filling it out is wondering if they need help from a cyber security specialist. The questions on the questionnaire can be difficult to understand, but most of them can be answered by either the IT provider or the business stakeholder. If the person filling out the questionnaire needs help, their best bet is to ask their insurance carrier or hire a cyber security specialist.

What do you think about this summary and how it compares to the one generated with OPT-175B?

Be more productive with AI-summarized text

There are many ways that AI-summarized text can help improve productivity in the business world. For example, it can be used in customer service to help your CSRs get to the core of a long-winded support request. It can also be used by a news agency to summarize press releases from the feed and help decide whether they warrant a deeper look.

Additionally, it can help busy executives get through their emails faster by extracting the key points from each message. Finally, it can be used in project management to create summaries of progress reports and help the manager identify areas that need more attention.

How to use GPT-3 to summarize text

To give GPT-3 a try, you’ll first need to open an account. When that is ready, head on over to the Playground. This works similarly to the hosted OPT-175B but with some additional controls. Here are a few that everyone should be aware of (a short code sketch of their API equivalents follows the list):

  • Model – This is the model to use. text-davinci-002 is the latest model at the time of this writing and comes at the highest cost. That cost is still incredibly low, but if you are generating large quantities of text it is something to keep an eye on.
  • Temperature – Just like in OPT-175B, the higher this number the more creative and random the AI will get.
  • Maximum length – the maximum number of tokens to use in generation. Tokens in this case are the smallest billing unit used by GPT-3, and currently cost $0.02 per 1,000 tokens for the Davinci model.
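
These Playground controls map directly onto parameters of the Completion API, so anything you dial in by hand can be reproduced in code. A minimal sketch with the legacy openai Python client (values and file name are placeholders):

    # Sketch: the Playground controls above as Completion API parameters.
    import openai

    openai.api_key = "YOUR_API_KEY"
    article_text = open("article.txt").read()  # placeholder source text

    response = openai.Completion.create(
        model="text-davinci-002",    # Model
        temperature=0.7,             # Temperature
        max_tokens=512,              # Maximum length
        prompt=article_text + "\n\nHere is a summary of the text:",
    )
    print(response.choices[0].text.strip())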

You can use the GPT-3 playground largely interchangeably with the OPT-175B web interface we’ve been using. And if you’re stuck on a prompt it’s a good idea to try the other one and see if it handles it any better.

Next up: Writing a Press Release in the GPT-3 Playground

< Back to the beginning
