Solved! Just pin the transformers version to 4.39.2.
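For anyone hitting the same error: the pin is one command in the environment that runs the demo (assuming pip; use the conda equivalent if that is how the env was built):

pip install transformers==4.39.2

A quick check from Python confirms which version actually gets imported:

import transformers
print(transformers.__version__)  # expect: 4.39.2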
The version I downloaded on April 2 runs and handles chat requests fine;
but the latest version I downloaded today starts up, yet every chat request fails with an error:
#1 Running python openai_api_demo/api_server.py still works:
Loading checkpoint shards: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 7/7 [00:07<00:00, 1.11s/it]
INFO: Started server process [18624]
INFO: Waiting for application startup.
INFO: Application startup complete.
INFO: Uvicorn running on http://0.0.0.0:8000 (Press CTRL+C to quit)
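For reference, the failing call described in #2 below can be reproduced with a short client script. This is only a sketch: the host and port come from the Uvicorn log above, the payload fields mirror the DEBUG output further down, and the "model" value is an assumed placeholder.

import requests

# Minimal repro sketch. Endpoint and port are from the Uvicorn startup log;
# payload fields mirror the gen_params shown in the DEBUG line below.
# The "model" value is a placeholder assumption, not verified against the demo.
payload = {
    "model": "chatglm3-6b",
    "messages": [
        {"role": "system",
         "content": "You are ChatGLM3, a large language model trained by Zhipu.AI. "
                    "Follow the user's instructions carefully. Respond using markdown."},
        {"role": "user", "content": "你好"},
    ],
    "temperature": 0.8,
    "top_p": 0.8,
    "max_tokens": 100,
    "stream": False,
}
resp = requests.post("http://127.0.0.1:8000/v1/chat/completions", json=payload, timeout=60)
print(resp.status_code)
print(resp.text)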
#2 But when I POST a "你好" ("hello") message to the API (same JSON format as in April), it fails:
2024-08-18 19:01:44.036 | DEBUG | __main__:create_chat_completion:242 - ==== request ====
{'messages': [ChatMessage(role='system', content="You are ChatGLM3, a large language model trained by Zhipu.AI. Follow the user's instructions carefully. Respond using markdown.", name=None, function_call=None), ChatMessage(role='user', content='你好', name=None, function_call=None)], 'temperature': 0.8, 'top_p': 0.8, 'max_tokens': 100, 'echo': False, 'stream': False, 'repetition_penalty': 1.1, 'tools': None}
INFO: 10.11.227.242:57295 - "POST /v1/chat/completions HTTP/1.1" 500 Internal Server Error
ERROR: Exception in ASGI application
Traceback (most recent call last):
File "/home/coolpadadmin/anaconda3/envs/ydq/lib/python3.12/site-packages/uvicorn/protocols/http/httptools_impl.py", line 401, in run_asgi
result = await app( # type: ignore[func-returns-value]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/coolpadadmin/anaconda3/envs/ydq/lib/python3.12/site-packages/uvicorn/middleware/proxy_headers.py", line 70, in call
return await self.app(scope, receive, send)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/coolpadadmin/anaconda3/envs/ydq/lib/python3.12/site-packages/fastapi/applications.py", line 1054, in call
await super().call(scope, receive, send)
File "/home/coolpadadmin/.local/lib/python3.12/site-packages/starlette/applications.py", line 123, in call
await self.middleware_stack(scope, receive, send)
File "/home/coolpadadmin/.local/lib/python3.12/site-packages/starlette/middleware/errors.py", line 186, in call
raise exc
File "/home/coolpadadmin/.local/lib/python3.12/site-packages/starlette/middleware/errors.py", line 164, in call
await self.app(scope, receive, _send)
File "/home/coolpadadmin/.local/lib/python3.12/site-packages/starlette/middleware/cors.py", line 85, in call
await self.app(scope, receive, send)
File "/home/coolpadadmin/.local/lib/python3.12/site-packages/starlette/middleware/exceptions.py", line 65, in call
await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
File "/home/coolpadadmin/.local/lib/python3.12/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
raise exc
File "/home/coolpadadmin/.local/lib/python3.12/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
await app(scope, receive, sender)
File "/home/coolpadadmin/.local/lib/python3.12/site-packages/starlette/routing.py", line 756, in call
await self.middleware_stack(scope, receive, send)
File "/home/coolpadadmin/.local/lib/python3.12/site-packages/starlette/routing.py", line 776, in app
await route.handle(scope, receive, send)
File "/home/coolpadadmin/.local/lib/python3.12/site-packages/starlette/routing.py", line 297, in handle
await self.app(scope, receive, send)
File "/home/coolpadadmin/.local/lib/python3.12/site-packages/starlette/routing.py", line 77, in app
await wrap_app_handling_exceptions(app, request)(scope, receive, send)
File "/home/coolpadadmin/.local/lib/python3.12/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
raise exc
File "/home/coolpadadmin/.local/lib/python3.12/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
await app(scope, receive, sender)
File "/home/coolpadadmin/.local/lib/python3.12/site-packages/starlette/routing.py", line 72, in app
response = await func(request)
^^^^^^^^^^^^^^^^^^^
File "/home/coolpadadmin/anaconda3/envs/ydq/lib/python3.12/site-packages/fastapi/routing.py", line 278, in app
raw_response = await run_endpoint_function(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/coolpadadmin/anaconda3/envs/ydq/lib/python3.12/site-packages/fastapi/routing.py", line 191, in run_endpoint_function
return await dependant.call(**values)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/coolpadadmin/work/coolai_test/llm/llm_glm3-6b/ChatGLM3/openai_api_demo/api_server.py", line 298, in create_chat_completion
response = generate_chatglm3(model, tokenizer, gen_params)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/coolpadadmin/work/coolai_test/llm/llm_glm3-6b/ChatGLM3/openai_api_demo/utils.py", line 165, in generate_chatglm3
for response in generate_stream_chatglm3(model, tokenizer, params):
File "/home/coolpadadmin/.local/lib/python3.12/site-packages/torch/utils/_contextlib.py", line 36, in generator_context
response = gen.send(None)
^^^^^^^^^^^^^^
File "/home/coolpadadmin/work/coolai_test/llm/llm_glm3-6b/ChatGLM3/openai_api_demo/utils.py", line 81, in generate_stream_chatglm3
for total_ids in model.stream_generate(**inputs, eos_token_id=eos_token_id, **gen_kwargs):
File "/home/coolpadadmin/.local/lib/python3.12/site-packages/torch/utils/_contextlib.py", line 36, in generator_context
response = gen.send(None)
^^^^^^^^^^^^^^
File "/home/coolpadadmin/.cache/huggingface/modules/transformers_modules/chatglm3-6b/modeling_chatglm.py", line 1138, in stream_generate
logits_processor = self._get_logits_processor(
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/coolpadadmin/anaconda3/envs/ydq/lib/python3.12/site-packages/transformers/generation/utils.py", line 866, in _get_logits_processor
and generation_config._eos_token_tensor is not None
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
AttributeError: 'GenerationConfig' object has no attribute '_eos_token_tensor'
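What the traceback shows: ChatGLM3's custom stream_generate (remote code in modeling_chatglm.py) calls transformers' private _get_logits_processor, and newer transformers releases read generation_config._eos_token_tensor inside it. That private attribute is normally populated by transformers' own generate() before _get_logits_processor is reached; the model's hand-rolled stream_generate never sets it, so the lookup fails on any release new enough to expect it. Pinning transformers==4.39.2, as in the accepted reply above, keeps the private API matching what the remote code was written against.

If downgrading is impossible, a defensive patch along these lines might work. This is an untested sketch: the attribute is private and may change again, other private fields may also be missing, and `model`/`tokenizer` are the objects the demo already loads.

import torch

# Untested workaround sketch: pre-set the private attribute that newer
# _get_logits_processor() implementations read. The eos id list mirrors the
# one built in openai_api_demo/utils.py. Pinning transformers==4.39.2 is
# still the verified fix.
gen_config = model.generation_config
if not hasattr(gen_config, "_eos_token_tensor"):
    eos_ids = [
        tokenizer.eos_token_id,
        tokenizer.get_command("<|user|>"),
        tokenizer.get_command("<|observation|>"),
    ]
    gen_config._eos_token_tensor = torch.tensor(eos_ids, device=model.device)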