Error running the example as it is

#15
by sabonzo - opened

I always get the following error:
╰─ HfApiModel - https://wxknx1kg971u7k1n.us-east-1.aws.endpoints.huggingface.c─╯
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ Step 1 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
Error in generating model output:
(Request ID: HLUsuE)

Bad request:
Bad Request: Invalid state
[Step 8: Duration 0.04 seconds]
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/gradio/queueing.py", line 715, in process_events
    response = await route_utils.call_process_api(
  File "/usr/local/lib/python3.10/site-packages/gradio/route_utils.py", line 322, in call_process_api
    output = await app.get_blocks().process_api(
  File "/usr/local/lib/python3.10/site-packages/gradio/blocks.py", line 2088, in process_api
    result = await self.call_function(
  File "/usr/local/lib/python3.10/site-packages/gradio/blocks.py", line 1647, in call_function
    prediction = await utils.async_iteration(iterator)
  File "/usr/local/lib/python3.10/site-packages/gradio/utils.py", line 728, in async_iteration
    return await anext(iterator)
  File "/usr/local/lib/python3.10/site-packages/gradio/utils.py", line 722, in anext
    return await anyio.to_thread.run_sync(
  File "/usr/local/lib/python3.10/site-packages/anyio/to_thread.py", line 56, in run_sync
    return await get_async_backend().run_sync_in_worker_thread(
  File "/usr/local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 2461, in run_sync_in_worker_thread
    return await future
  File "/usr/local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 962, in run
    result = context.run(func, *args)
  File "/usr/local/lib/python3.10/site-packages/gradio/utils.py", line 705, in run_sync_iterator_async
    return next(iterator)
  File "/usr/local/lib/python3.10/site-packages/gradio/utils.py", line 866, in gen_wrapper
    response = next(iterator)
  File "/home/user/app/Gradio_UI.py", line 197, in interact_with_agent
    for msg in stream_to_gradio(self.agent, task=prompt, reset_agent_memory=False):
  File "/home/user/app/Gradio_UI.py", line 145, in stream_to_gradio
    total_input_tokens += agent.model.last_input_token_count
TypeError: unsupported operand type(s) for +=: 'int' and 'NoneType'

So am I doing something wrong, or am I missing something?

I had the same problem.
In Gradio_UI.py I changed line 144 as shown below:
if hasattr(agent.model, "last_input_token_count") and agent.model.last_input_token_count is not None:
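
In context, the guarded block would look roughly like this (a sketch of the code around lines 144-146 of the template's Gradio_UI.py; the output-token line is assumed to sit right below the input-token one):

if hasattr(agent.model, "last_input_token_count") and agent.model.last_input_token_count is not None:
    # Only accumulate when the model actually reported token counts;
    # some backends leave these attributes set to None.
    total_input_tokens += agent.model.last_input_token_count
    total_output_tokens += agent.model.last_output_token_count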

And then I faced the next problem: an HF token is needed for HfApiModel, like this:

import os

model = HfApiModel(
    ...
    token=os.getenv('hf_token')
)
Define your secret hf_token in the Settings of your Space.
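
For reference, a minimal sketch of the full model setup with the secret wired in (the model_id below is only a placeholder, and "hf_token" must match the secret name you defined):

import os

from smolagents import HfApiModel

# "hf_token" must match the secret name set under the Space's
# Settings -> Variables and secrets. The model_id is a placeholder.
model = HfApiModel(
    model_id="Qwen/Qwen2.5-Coder-32B-Instruct",
    token=os.getenv("hf_token"),
)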
