This error appears every time I run the Space:
👤: What is transhumanism? until the 70s
🤖: Step 1
🤖: Error in generating model output:
litellm.ContextWindowExceededError: litellm.BadRequestError: ContextWindowExceededError: OpenAIException - Error code: 400 - {'error': {'message': "This model's maximum context length is 128000 tokens. However, your messages resulted in 349303 tokens. Please reduce the length of the messages.", 'type': 'invalid_request_error', 'param': 'messages', 'code': 'context_length_exceeded'}}
🤖: Step 1 | Duration: 12.41
🤖: -----
🤖: Step 2
🤖: Error in generating model output:
litellm.ContextWindowExceededError: litellm.BadRequestError: ContextWindowExceededError: OpenAIException - Error code: 400 - {'error': {'message': "This model's maximum context length is 128000 tokens. However, your messages resulted in 349432 tokens. Please reduce the length of the messages.", 'type': 'invalid_request_error', 'param': 'messages', 'code': 'context_length_exceeded'}}
🤖: Step 2 | Duration: 4.55
🤖: -----
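The error itself says what has to happen: the conversation sent to the model (~349k tokens) must be cut down below the 128k-token context window before the request is made. One common workaround is to trim the oldest turns from the message history while keeping the system prompt and the most recent steps. Below is a minimal, hypothetical sketch of that idea — `trim_messages` and the ~4-characters-per-token estimate are my own assumptions for illustration, not part of litellm or the Space's actual code; a real fix would count tokens with the provider's tokenizer.

```python
# Hypothetical sketch: drop the oldest conversation turns so the request
# stays under the model's context window. Token counts are approximated
# here as len(content) // 4; swap in the provider's tokenizer in practice.

def approx_tokens(message):
    """Rough token estimate: about 4 characters per token."""
    return max(1, len(message.get("content", "")) // 4)

def trim_messages(messages, max_tokens=128_000):
    """Keep the system prompt (if any) plus the newest turns that fit."""
    if not messages:
        return []
    system, rest = [], messages
    if messages[0].get("role") == "system":
        system, rest = [messages[0]], messages[1:]
    budget = max_tokens - sum(approx_tokens(m) for m in system)
    kept = []
    for m in reversed(rest):  # walk from newest to oldest
        cost = approx_tokens(m)
        if cost > budget:
            break  # older turns no longer fit; drop everything before this
        kept.append(m)
        budget -= cost
    return system + list(reversed(kept))
```

Calling `trim_messages(history)` right before the model call would keep the request under the limit, at the cost of the agent forgetting its earliest steps; summarizing dropped turns instead of discarding them is a gentler variant of the same idea.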