File-search ignores the "Max num results" setting

Hi,
I have a JSON file that serves as the data store and have fed it to the assistant. When running the assistant with the default settings (gpt-4o with the default of 20 results), it consumes around 18,000 tokens.

However, when I reduce it to only 10 results, it still consumes 18,000 tokens.

Note that in both cases only a small amount of text is involved. Below is a snippet of the input and output:

User:

i want articles about nanocomposites

Response:

Here are the articles related to nanocomposites that I found:

  1. Nanocomposites for Prosthetic Dental Technology: A Systemic Review
  • URL:
  2. The Effect of Nano Hydroxyapatite on the Bond Strength of Acrylic Teeth Repaired to an Acrylic Resin Denture Base Using Two Types of Auto Polymerized Acrylic Resins
  • URL:
  3. Evaluation of Flexure Strength of Heat Cure Acrylic Resin Reinforcement with Nano Al2O3 After Polishing with Different Abrasive Materials
  • URL:

For additional information or further assistance, please feel free to ask!

[1][2][3]

Sounds like a bug to me; making a post here will most likely not get it fixed. I’ve created a post on how to report bugs to OpenAI, with step-by-step instructions:

https://community.openai.com/t/how-to-properly-report-a-bug-to-openai

In my experience, this has been the fastest way to get something fixed.

Thanks a ton, let me report it.

Is there a way to change all these settings programmatically using the Python SDK?

Check this out: https://platform.openai.com/docs/assistants/tools/file-search/ensuring-vector-store-readiness-before-creating-runs
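That page covers, among other things, waiting for the vector store to finish indexing before creating a run. A minimal polling sketch of that idea (the `retrieve` call and `status` field follow the vector-store object in the docs; treat exact names as something to verify against the current SDK):

```python
import time


def wait_for_vector_store(client, vector_store_id: str, timeout: float = 60.0):
    """Poll until the vector store has finished processing its files.

    `client` is assumed to be an `openai.OpenAI` instance; status values
    like "in_progress" / "completed" follow the vector-store object in
    the docs linked above.
    """
    deadline = time.time() + timeout
    while time.time() < deadline:
        vs = client.beta.vector_stores.retrieve(vector_store_id)
        if vs.status == "completed":
            return vs
        time.sleep(1.0)  # back off briefly between polls
    raise TimeoutError(f"vector store {vector_store_id} not ready after {timeout}s")
```

Only create the run once this returns; otherwise file_search may answer from a partially indexed store.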

The SDK doesn’t recognize these parameters:
`chunking_strategy` and `file_search.max_num_results`

Please check the library properly. Here is the code sample from the OpenAI library for the static chunking strategy; the definition is present in the SDK. Also, make sure you have an up-to-date version of the library, as changes were committed recently.

from typing_extensions import Required, TypedDict

class ChunkingStrategyStaticStatic(TypedDict, total=False):
    chunk_overlap_tokens: Required[int]
    """The number of tokens that overlap between chunks. The default value is `400`.

    Note that the overlap must not exceed half of `max_chunk_size_tokens`.
    """

    max_chunk_size_tokens: Required[int]
    """The maximum number of tokens in each chunk.

    The default value is `800`. The minimum value is `100` and the maximum value is
    `4096`.
    """

(code-credit: OpenAI library)
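To tie it together, here is a sketch of passing both settings through the Python SDK: the dict shapes for `chunking_strategy` and the `file_search` tool match the API reference, but the method names and minimum SDK version are assumptions worth double-checking against your installed version (`pip install -U openai`).

```python
# Static chunking strategy, passed when attaching a file to a vector store.
CHUNKING_STRATEGY = {
    "type": "static",
    "static": {
        "max_chunk_size_tokens": 800,  # default 800; range 100-4096
        "chunk_overlap_tokens": 400,   # default 400; must not exceed half of max_chunk_size_tokens
    },
}

# file_search tool with a lower result cap, passed on assistant (or run) creation.
FILE_SEARCH_TOOL = {
    "type": "file_search",
    "file_search": {"max_num_results": 10},
}


def create_assistant_with_limits(api_key: str, file_path: str):
    """Create a vector store with static chunking, then an assistant whose
    file_search tool returns at most 10 results. Requires a recent openai SDK."""
    from openai import OpenAI  # imported here so the sketch stays importable without the SDK installed

    client = OpenAI(api_key=api_key)

    vector_store = client.beta.vector_stores.create(name="articles")
    with open(file_path, "rb") as f:
        uploaded = client.files.create(file=f, purpose="assistants")
    client.beta.vector_stores.files.create(
        vector_store_id=vector_store.id,
        file_id=uploaded.id,
        chunking_strategy=CHUNKING_STRATEGY,
    )

    return client.beta.assistants.create(
        model="gpt-4o",
        tools=[FILE_SEARCH_TOOL],
        tool_resources={"file_search": {"vector_store_ids": [vector_store.id]}},
    )
```

Whether `max_num_results` actually reduces token usage is exactly the bug being reported above, so it's worth re-measuring token counts after setting it this way.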