Using Tavily search to get a single string answer

I’m trying to get Tavily to search the web for information about an item based on its part number. Tavily successfully finds the manufacturer, as it should, but the problem is with the “answer” it provides.

Currently, my query looks like this: “What is the name of the company that manufactures the item with part number X?”, but most of the time it yields an answer like “Y is the company that manufactures the item with part number X”.

I’m looking for a way to get the answer to be just “Y”, without the extra words. Is this possible, or should I make adjustments on my end to “filter” out the extra phrasing?
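For reference, a minimal sketch of the kind of call I’m making, assuming the tavily-python client (the part number below is just a placeholder):

```python
# Minimal sketch, assuming the tavily-python client; the part number is a placeholder.
from tavily import TavilyClient

client = TavilyClient(api_key="tvly-...")

response = client.search(
    query="What is the name of the company that manufactures the item with part number PN-12345?",
    include_answer=True,
)

# Prints something like:
# "Y is the company that manufactures the item with part number PN-12345."
print(response["answer"])
```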

Hi there,
Thank you for using Tavily!

The include_answer feature is designed to provide a concise answer to your original query based on the content of the URLs returned in the response. However, it often includes extra context or phrasing from the sources. If you’re looking for a more customized or specific answer, you may need to filter or adjust the response on your end.
Tavily’s Search API is primarily designed to give you context from the web to help transform vast amounts of unstructured web data into actionable insights, ultimately unlocking new opportunities for decision making.

Best,
May


Thank you for the answer, May!

I managed to filter the results on my end by stripping the extra words that come after the name, but this is not really optimal. I can imagine a scenario where Tavily’s API adds words at the beginning of the phrase too, and in that case how do you suggest we proceed, if we don’t know in advance which words Tavily will wrap around the actual answer?

To illustrate:

Query: “What is the manufacturer of product X?”

Possible answer A: “Y is the manufacturer of X.”
Possible answer B: “The manufacturer of X is Y.”
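
A pattern-based filter would have to anticipate every possible phrasing in advance. Purely as a hypothetical sketch of what I mean:

```python
import re

# Hypothetical sketch: each pattern only covers one phrasing of the generated answer.
PATTERNS = [
    re.compile(r"^(?P<name>.+?) is the manufacturer of", re.IGNORECASE),          # answer A
    re.compile(r"the manufacturer of .+? is (?P<name>.+?)\.?$", re.IGNORECASE),   # answer B
]

def extract_manufacturer(answer):
    for pattern in PATTERNS:
        match = pattern.search(answer)
        if match:
            return match.group("name").strip()
    return None  # any phrasing I did not anticipate slips through

print(extract_manufacturer("Y is the manufacturer of X."))  # Y
print(extract_manufacturer("The manufacturer of X is Y."))  # Y
print(extract_manufacturer("X is manufactured by Y."))      # None
```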

Do you see how this could impact an automated system’s code? Would I need to pass the Tavily search result to another LLM, such as one of OpenAI’s models, for post-processing?

Hi!

In cases like this, passing the results from Tavily’s API to another LLM for restructuring seems like the best solution. This approach helps ensure the output fits the desired format, regardless of how the include_answer text is phrased.
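As a rough sketch, that post-processing step could look something like this with OpenAI’s Python SDK (the model name and prompt are placeholder choices, not an official recommendation):

```python
# Rough sketch: ask a second model to reduce Tavily's answer to the bare name.
# The model name and prompt are placeholder choices.
from openai import OpenAI

openai_client = OpenAI()  # reads OPENAI_API_KEY from the environment

def extract_name(tavily_answer):
    completion = openai_client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": "Reply with only the manufacturer's name, nothing else."},
            {"role": "user", "content": tavily_answer},
        ],
    )
    return completion.choices[0].message.content.strip()

# extract_name("The manufacturer of X is Y.")  ->  "Y"
```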

If you’d like to discuss this further, feel free to reach out!

Best,
May