li-qing committed
Commit 517b6c2
1 Parent(s): 682d6db

feat: update

gradio_web_server.log CHANGED
@@ -439,3 +439,566 @@
  2024-07-10 07:24:48 | INFO | stdout | Running on local URL: http://0.0.0.0:7860
  2024-07-10 07:24:48 | INFO | stdout |
  2024-07-10 07:24:48 | INFO | stdout | To create a public link, set `share=True` in `launch()`.
+ 2024-07-10 07:26:20 | INFO | gradio_web_server | bot_response. ip: 46.3.240.105
+ 2024-07-10 07:26:20 | INFO | gradio_web_server | monitor error: HTTPConnectionPool(host='localhost', port=9090): Max retries exceeded with url: /is_limit_reached?model=llava-fire&user_id=46.3.240.105 (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fb91047d1b0>: Failed to establish a new connection: [Errno 111] Connection refused'))
+ 2024-07-10 07:26:20 | INFO | gradio_web_server | model_name: llava-fire;model_api_dict: None
+ 2024-07-10 07:26:20 | INFO | gradio_web_server | bot_response. ip: 46.3.240.105
+ 2024-07-10 07:26:20 | INFO | gradio_web_server | monitor error: HTTPConnectionPool(host='localhost', port=9090): Max retries exceeded with url: /is_limit_reached?model=llava-original&user_id=46.3.240.105 (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fb91047d870>: Failed to establish a new connection: [Errno 111] Connection refused'))
+ 2024-07-10 07:26:20 | INFO | gradio_web_server | model_name: llava-original;model_api_dict: None
+ 2024-07-10 07:26:28 | INFO | stdout | torch.Size([1, 58]) torch.Size([1, 5, 3, 336, 336])
+ 2024-07-10 07:26:29 | ERROR | stderr | /usr/local/lib/python3.10/site-packages/transformers/generation/configuration_utils.py:392: UserWarning: `do_sample` is set to `False`. However, `temperature` is set to `0` -- this flag is only used in sample-based generation modes. You should set `do_sample=True` or unset `temperature`.
+ 2024-07-10 07:26:29 | ERROR | stderr | warnings.warn(
+ 2024-07-10 07:26:29 | ERROR | stderr | /usr/local/lib/python3.10/site-packages/transformers/generation/configuration_utils.py:397: UserWarning: `do_sample` is set to `False`. However, `top_p` is set to `0.9` -- this flag is only used in sample-based generation modes. You should set `do_sample=True` or unset `top_p`.
+ 2024-07-10 07:26:29 | ERROR | stderr | warnings.warn(
+ 2024-07-10 07:26:34 | INFO | stdout | ["The figure in the image is a can of Södra Almighy, which is a beer. Södra is a Swedish brewery known for its lager beers. The can's design is modern and minimalist, with a color scheme that includes black and white, and the text is in English. The label indicates that it is a dry-hopped beer, which means it has been flavored with hops that have not been steeped in the brewing process, giving it a unique taste profile. The can's design suggests a contemporary and possibly craft beer, which is often associated with a more complex flavor profile than traditional lagers."]
+ 2024-07-10 07:27:01 | INFO | gradio_web_server | bot_response. ip: 46.3.240.106
+ 2024-07-10 07:27:01 | INFO | gradio_web_server | monitor error: HTTPConnectionPool(host='localhost', port=9090): Max retries exceeded with url: /is_limit_reached?model=llava-fire&user_id=46.3.240.106 (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fb91047fd90>: Failed to establish a new connection: [Errno 111] Connection refused'))
+ 2024-07-10 07:27:01 | INFO | gradio_web_server | model_name: llava-fire;model_api_dict: None
+ 2024-07-10 07:27:01 | INFO | gradio_web_server | bot_response. ip: 46.3.240.106
+ 2024-07-10 07:27:01 | INFO | gradio_web_server | monitor error: HTTPConnectionPool(host='localhost', port=9090): Max retries exceeded with url: /is_limit_reached?model=llava-original&user_id=46.3.240.106 (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fb91047ded0>: Failed to establish a new connection: [Errno 111] Connection refused'))
+ 2024-07-10 07:27:01 | INFO | gradio_web_server | model_name: llava-original;model_api_dict: None
+ 2024-07-10 07:27:14 | INFO | stdout | torch.Size([1, 58]) torch.Size([1, 5, 3, 336, 336])
+ 2024-07-10 07:27:15 | ERROR | stderr | /usr/local/lib/python3.10/site-packages/transformers/generation/configuration_utils.py:392: UserWarning: `do_sample` is set to `False`. However, `temperature` is set to `0` -- this flag is only used in sample-based generation modes. You should set `do_sample=True` or unset `temperature`.
+ 2024-07-10 07:27:15 | ERROR | stderr | warnings.warn(
+ 2024-07-10 07:27:15 | ERROR | stderr | /usr/local/lib/python3.10/site-packages/transformers/generation/configuration_utils.py:397: UserWarning: `do_sample` is set to `False`. However, `top_p` is set to `0.9` -- this flag is only used in sample-based generation modes. You should set `do_sample=True` or unset `top_p`.
+ 2024-07-10 07:27:15 | ERROR | stderr | warnings.warn(
+ 2024-07-10 07:27:20 | INFO | stdout | ["The figure in the image is a can of Södra Almighy, which is a beer. Södra is a Swedish brewery known for its lager beers. The can's design is modern and minimalist, with a color scheme that includes black and white, and the text is in English. The label indicates that it is a dry-hopped beer, which means it has been flavored with hops that have not been steeped in the brewing process, giving it a unique taste profile. The can's design suggests a contemporary and possibly craft beer, which is often associated with a more complex flavor profile than traditional lagers."]
+ 2024-07-10 07:27:20 | INFO | gradio_web_server | hello
+ 2024-07-10 07:27:20 | INFO | gradio_web_server | The figure in the image is a can of Södra Almighy, which is a beer. Södra is a Swedish brewery known for its lager beers. The can's design is modern and minimalist, with a color scheme that includes black and white, and the text is in English. The label indicates that it is a dry-hopped beer, which means it has been flavored with hops that have not been steeped in the brewing process, giving it a unique taste profile. The can's design suggests a contemporary and possibly craft beer, which is often associated with a more complex flavor profile than traditional lagers.
+ 2024-07-10 07:37:00 | INFO | stdout | Running on local URL: http://0.0.0.0:7860
+ 2024-07-10 07:37:00 | INFO | stdout |
+ 2024-07-10 07:37:00 | INFO | stdout | To create a public link, set `share=True` in `launch()`.
+ 2024-07-10 07:37:21 | ERROR | stderr | Traceback (most recent call last):
+ 2024-07-10 07:37:21 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/queueing.py", line 541, in process_events
+ 2024-07-10 07:37:21 | ERROR | stderr | response = await route_utils.call_process_api(
+ 2024-07-10 07:37:21 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/route_utils.py", line 276, in call_process_api
+ 2024-07-10 07:37:21 | ERROR | stderr | output = await app.get_blocks().process_api(
+ 2024-07-10 07:37:21 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/blocks.py", line 1928, in process_api
+ 2024-07-10 07:37:21 | ERROR | stderr | result = await self.call_function(
+ 2024-07-10 07:37:21 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/blocks.py", line 1514, in call_function
+ 2024-07-10 07:37:21 | ERROR | stderr | prediction = await anyio.to_thread.run_sync(
+ 2024-07-10 07:37:21 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/anyio/to_thread.py", line 56, in run_sync
+ 2024-07-10 07:37:21 | ERROR | stderr | return await get_async_backend().run_sync_in_worker_thread(
+ 2024-07-10 07:37:21 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 2177, in run_sync_in_worker_thread
+ 2024-07-10 07:37:21 | ERROR | stderr | return await future
+ 2024-07-10 07:37:21 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 859, in run
+ 2024-07-10 07:37:21 | ERROR | stderr | result = context.run(func, *args)
+ 2024-07-10 07:37:21 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/utils.py", line 833, in wrapper
+ 2024-07-10 07:37:21 | ERROR | stderr | response = f(*args, **kwargs)
+ 2024-07-10 07:37:21 | ERROR | stderr | File "/home/user/app/src/serve/gradio_block_arena_vision_named.py", line 168, in add_text
+ 2024-07-10 07:37:21 | ERROR | stderr | states[i] = State(model_selectors[i], is_vision=True)
+ 2024-07-10 07:37:21 | ERROR | stderr | File "/home/user/app/src/serve/gradio_web_server.py", line 115, in __init__
+ 2024-07-10 07:37:21 | ERROR | stderr | self.init_system_prompt(self.conv)
+ 2024-07-10 07:37:21 | ERROR | stderr | File "/home/user/app/src/serve/gradio_web_server.py", line 118, in init_system_prompt
+ 2024-07-10 07:37:21 | ERROR | stderr | system_prompt = conv.get_system_message()
+ 2024-07-10 07:37:21 | ERROR | stderr | AttributeError: 'Conversation' object has no attribute 'get_system_message'
+ 2024-07-10 07:37:22 | ERROR | stderr | Traceback (most recent call last):
+ 2024-07-10 07:37:22 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/queueing.py", line 541, in process_events
+ 2024-07-10 07:37:22 | ERROR | stderr | response = await route_utils.call_process_api(
+ 2024-07-10 07:37:22 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/route_utils.py", line 276, in call_process_api
+ 2024-07-10 07:37:22 | ERROR | stderr | output = await app.get_blocks().process_api(
+ 2024-07-10 07:37:22 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/blocks.py", line 1928, in process_api
+ 2024-07-10 07:37:22 | ERROR | stderr | result = await self.call_function(
+ 2024-07-10 07:37:22 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/blocks.py", line 1526, in call_function
+ 2024-07-10 07:37:22 | ERROR | stderr | prediction = await utils.async_iteration(iterator)
+ 2024-07-10 07:37:22 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/utils.py", line 657, in async_iteration
+ 2024-07-10 07:37:22 | ERROR | stderr | return await iterator.__anext__()
+ 2024-07-10 07:37:22 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/utils.py", line 650, in __anext__
+ 2024-07-10 07:37:22 | ERROR | stderr | return await anyio.to_thread.run_sync(
+ 2024-07-10 07:37:22 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/anyio/to_thread.py", line 56, in run_sync
+ 2024-07-10 07:37:22 | ERROR | stderr | return await get_async_backend().run_sync_in_worker_thread(
+ 2024-07-10 07:37:22 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 2177, in run_sync_in_worker_thread
+ 2024-07-10 07:37:22 | ERROR | stderr | return await future
+ 2024-07-10 07:37:22 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 859, in run
+ 2024-07-10 07:37:22 | ERROR | stderr | result = context.run(func, *args)
+ 2024-07-10 07:37:22 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/utils.py", line 633, in run_sync_iterator_async
+ 2024-07-10 07:37:22 | ERROR | stderr | return next(iterator)
+ 2024-07-10 07:37:22 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/utils.py", line 816, in gen_wrapper
+ 2024-07-10 07:37:22 | ERROR | stderr | response = next(iterator)
+ 2024-07-10 07:37:22 | ERROR | stderr | File "/home/user/app/src/serve/gradio_block_arena_named.py", line 237, in bot_response_multi
+ 2024-07-10 07:37:22 | ERROR | stderr | if state0.skip_next:
+ 2024-07-10 07:37:22 | ERROR | stderr | AttributeError: 'NoneType' object has no attribute 'skip_next'
+ 2024-07-10 07:39:41 | INFO | stdout | Running on local URL: http://0.0.0.0:7860
+ 2024-07-10 07:39:41 | INFO | stdout |
+ 2024-07-10 07:39:41 | INFO | stdout | To create a public link, set `share=True` in `launch()`.
+ 2024-07-10 07:40:22 | ERROR | stderr | Traceback (most recent call last):
+ 2024-07-10 07:40:22 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/queueing.py", line 541, in process_events
+ 2024-07-10 07:40:22 | ERROR | stderr | response = await route_utils.call_process_api(
+ 2024-07-10 07:40:22 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/route_utils.py", line 276, in call_process_api
+ 2024-07-10 07:40:22 | ERROR | stderr | output = await app.get_blocks().process_api(
+ 2024-07-10 07:40:22 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/blocks.py", line 1928, in process_api
+ 2024-07-10 07:40:22 | ERROR | stderr | result = await self.call_function(
+ 2024-07-10 07:40:22 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/blocks.py", line 1514, in call_function
+ 2024-07-10 07:40:22 | ERROR | stderr | prediction = await anyio.to_thread.run_sync(
+ 2024-07-10 07:40:22 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/anyio/to_thread.py", line 56, in run_sync
+ 2024-07-10 07:40:22 | ERROR | stderr | return await get_async_backend().run_sync_in_worker_thread(
+ 2024-07-10 07:40:22 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 2177, in run_sync_in_worker_thread
+ 2024-07-10 07:40:22 | ERROR | stderr | return await future
+ 2024-07-10 07:40:22 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 859, in run
+ 2024-07-10 07:40:22 | ERROR | stderr | result = context.run(func, *args)
+ 2024-07-10 07:40:22 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/utils.py", line 833, in wrapper
+ 2024-07-10 07:40:22 | ERROR | stderr | response = f(*args, **kwargs)
+ 2024-07-10 07:40:22 | ERROR | stderr | File "/home/user/app/src/serve/gradio_block_arena_vision_named.py", line 168, in add_text
+ 2024-07-10 07:40:22 | ERROR | stderr | states[i] = State(model_selectors[i], is_vision=True)
+ 2024-07-10 07:40:22 | ERROR | stderr | File "/home/user/app/src/serve/gradio_web_server.py", line 115, in __init__
+ 2024-07-10 07:40:22 | ERROR | stderr | self.init_system_prompt(self.conv)
+ 2024-07-10 07:40:22 | ERROR | stderr | File "/home/user/app/src/serve/gradio_web_server.py", line 126, in init_system_prompt
+ 2024-07-10 07:40:22 | ERROR | stderr | system_prompt = system_prompt.replace("{{currentDateTime}}", current_date)
+ 2024-07-10 07:40:22 | ERROR | stderr | AttributeError: 'Conversation' object has no attribute 'set_system_message'
+ 2024-07-10 07:40:23 | ERROR | stderr | Traceback (most recent call last):
+ 2024-07-10 07:40:23 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/queueing.py", line 541, in process_events
+ 2024-07-10 07:40:23 | ERROR | stderr | response = await route_utils.call_process_api(
+ 2024-07-10 07:40:23 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/route_utils.py", line 276, in call_process_api
+ 2024-07-10 07:40:23 | ERROR | stderr | output = await app.get_blocks().process_api(
+ 2024-07-10 07:40:23 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/blocks.py", line 1928, in process_api
+ 2024-07-10 07:40:23 | ERROR | stderr | result = await self.call_function(
+ 2024-07-10 07:40:23 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/blocks.py", line 1526, in call_function
+ 2024-07-10 07:40:23 | ERROR | stderr | prediction = await utils.async_iteration(iterator)
+ 2024-07-10 07:40:23 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/utils.py", line 657, in async_iteration
+ 2024-07-10 07:40:23 | ERROR | stderr | return await iterator.__anext__()
+ 2024-07-10 07:40:23 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/utils.py", line 650, in __anext__
+ 2024-07-10 07:40:23 | ERROR | stderr | return await anyio.to_thread.run_sync(
+ 2024-07-10 07:40:23 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/anyio/to_thread.py", line 56, in run_sync
+ 2024-07-10 07:40:23 | ERROR | stderr | return await get_async_backend().run_sync_in_worker_thread(
+ 2024-07-10 07:40:23 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 2177, in run_sync_in_worker_thread
+ 2024-07-10 07:40:23 | ERROR | stderr | return await future
+ 2024-07-10 07:40:23 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 859, in run
+ 2024-07-10 07:40:23 | ERROR | stderr | result = context.run(func, *args)
+ 2024-07-10 07:40:23 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/utils.py", line 633, in run_sync_iterator_async
+ 2024-07-10 07:40:23 | ERROR | stderr | return next(iterator)
+ 2024-07-10 07:40:23 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/utils.py", line 816, in gen_wrapper
+ 2024-07-10 07:40:23 | ERROR | stderr | response = next(iterator)
+ 2024-07-10 07:40:23 | ERROR | stderr | File "/home/user/app/src/serve/gradio_block_arena_named.py", line 237, in bot_response_multi
+ 2024-07-10 07:40:23 | ERROR | stderr | if state0.skip_next:
+ 2024-07-10 07:40:23 | ERROR | stderr | AttributeError: 'NoneType' object has no attribute 'skip_next'
+ 2024-07-10 07:40:40 | INFO | stdout | Running on local URL: http://0.0.0.0:7860
+ 2024-07-10 07:40:40 | INFO | stdout |
+ 2024-07-10 07:40:40 | INFO | stdout | To create a public link, set `share=True` in `launch()`.
+ 2024-07-10 07:40:54 | INFO | stdout | moderating image: /tmp/gradio/405613dcd3661394aad4b9b9addbd1743365fabf/screenshot-20240708-164613.png
+ 2024-07-10 07:40:54 | ERROR | stderr | Traceback (most recent call last):
+ 2024-07-10 07:40:54 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/queueing.py", line 541, in process_events
+ 2024-07-10 07:40:54 | ERROR | stderr | response = await route_utils.call_process_api(
+ 2024-07-10 07:40:54 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/route_utils.py", line 276, in call_process_api
+ 2024-07-10 07:40:54 | ERROR | stderr | output = await app.get_blocks().process_api(
+ 2024-07-10 07:40:54 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/blocks.py", line 1928, in process_api
+ 2024-07-10 07:40:54 | ERROR | stderr | result = await self.call_function(
+ 2024-07-10 07:40:54 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/blocks.py", line 1514, in call_function
+ 2024-07-10 07:40:54 | ERROR | stderr | prediction = await anyio.to_thread.run_sync(
+ 2024-07-10 07:40:54 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/anyio/to_thread.py", line 56, in run_sync
+ 2024-07-10 07:40:54 | ERROR | stderr | return await get_async_backend().run_sync_in_worker_thread(
+ 2024-07-10 07:40:54 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 2177, in run_sync_in_worker_thread
+ 2024-07-10 07:40:54 | ERROR | stderr | return await future
+ 2024-07-10 07:40:54 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 859, in run
+ 2024-07-10 07:40:54 | ERROR | stderr | result = context.run(func, *args)
+ 2024-07-10 07:40:54 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/utils.py", line 833, in wrapper
+ 2024-07-10 07:40:54 | ERROR | stderr | response = f(*args, **kwargs)
+ 2024-07-10 07:40:54 | ERROR | stderr | File "/home/user/app/src/serve/gradio_block_arena_vision_named.py", line 190, in add_text
+ 2024-07-10 07:40:54 | ERROR | stderr | text, image_flagged, csam_flag = moderate_input(
+ 2024-07-10 07:40:54 | ERROR | stderr | File "/home/user/app/src/serve/gradio_block_arena_vision.py", line 148, in moderate_input
+ 2024-07-10 07:40:54 | ERROR | stderr | nsfw_flagged, csam_flagged = image_moderation_filter(images[0])
+ 2024-07-10 07:40:54 | ERROR | stderr | File "/home/user/app/src/utils.py", line 496, in image_moderation_filter
+ 2024-07-10 07:40:54 | ERROR | stderr | nsfw_flagged = image_moderation_provider(image_bytes, "nsfw")
+ 2024-07-10 07:40:54 | ERROR | stderr | File "/home/user/app/src/utils.py", line 478, in image_moderation_provider
+ 2024-07-10 07:40:54 | ERROR | stderr | endpoint = os.environ["AZURE_IMG_MODERATION_ENDPOINT"]
+ 2024-07-10 07:40:54 | ERROR | stderr | File "/usr/local/lib/python3.10/os.py", line 680, in __getitem__
+ 2024-07-10 07:40:54 | ERROR | stderr | raise KeyError(key) from None
+ 2024-07-10 07:40:54 | ERROR | stderr | KeyError: 'AZURE_IMG_MODERATION_ENDPOINT'
+ 2024-07-10 07:40:55 | ERROR | stderr | Traceback (most recent call last):
+ 2024-07-10 07:40:55 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/queueing.py", line 541, in process_events
+ 2024-07-10 07:40:55 | ERROR | stderr | response = await route_utils.call_process_api(
+ 2024-07-10 07:40:55 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/route_utils.py", line 276, in call_process_api
+ 2024-07-10 07:40:55 | ERROR | stderr | output = await app.get_blocks().process_api(
+ 2024-07-10 07:40:55 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/blocks.py", line 1928, in process_api
+ 2024-07-10 07:40:55 | ERROR | stderr | result = await self.call_function(
+ 2024-07-10 07:40:55 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/blocks.py", line 1526, in call_function
+ 2024-07-10 07:40:55 | ERROR | stderr | prediction = await utils.async_iteration(iterator)
+ 2024-07-10 07:40:55 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/utils.py", line 657, in async_iteration
+ 2024-07-10 07:40:55 | ERROR | stderr | return await iterator.__anext__()
+ 2024-07-10 07:40:55 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/utils.py", line 650, in __anext__
+ 2024-07-10 07:40:55 | ERROR | stderr | return await anyio.to_thread.run_sync(
+ 2024-07-10 07:40:55 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/anyio/to_thread.py", line 56, in run_sync
+ 2024-07-10 07:40:55 | ERROR | stderr | return await get_async_backend().run_sync_in_worker_thread(
+ 2024-07-10 07:40:55 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 2177, in run_sync_in_worker_thread
+ 2024-07-10 07:40:55 | ERROR | stderr | return await future
+ 2024-07-10 07:40:55 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 859, in run
+ 2024-07-10 07:40:55 | ERROR | stderr | result = context.run(func, *args)
+ 2024-07-10 07:40:55 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/utils.py", line 633, in run_sync_iterator_async
+ 2024-07-10 07:40:55 | ERROR | stderr | return next(iterator)
+ 2024-07-10 07:40:55 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/utils.py", line 816, in gen_wrapper
+ 2024-07-10 07:40:55 | ERROR | stderr | response = next(iterator)
+ 2024-07-10 07:40:55 | ERROR | stderr | File "/home/user/app/src/serve/gradio_block_arena_named.py", line 237, in bot_response_multi
+ 2024-07-10 07:40:55 | ERROR | stderr | if state0.skip_next:
+ 2024-07-10 07:40:55 | ERROR | stderr | AttributeError: 'NoneType' object has no attribute 'skip_next'
+ 2024-07-10 07:43:45 | INFO | stdout | Running on local URL: http://0.0.0.0:7860
+ 2024-07-10 07:43:45 | INFO | stdout |
+ 2024-07-10 07:43:45 | INFO | stdout | To create a public link, set `share=True` in `launch()`.
+ 2024-07-10 07:44:15 | INFO | stdout | moderating image: /tmp/gradio/405613dcd3661394aad4b9b9addbd1743365fabf/screenshot-20240708-164613.png
+ 2024-07-10 07:44:15 | INFO | stdout | skip for now
+ 2024-07-10 07:44:15 | ERROR | stderr | Traceback (most recent call last):
+ 2024-07-10 07:44:15 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/queueing.py", line 541, in process_events
+ 2024-07-10 07:44:15 | ERROR | stderr | response = await route_utils.call_process_api(
+ 2024-07-10 07:44:15 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/route_utils.py", line 276, in call_process_api
+ 2024-07-10 07:44:15 | ERROR | stderr | output = await app.get_blocks().process_api(
+ 2024-07-10 07:44:15 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/blocks.py", line 1928, in process_api
+ 2024-07-10 07:44:15 | ERROR | stderr | result = await self.call_function(
+ 2024-07-10 07:44:15 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/blocks.py", line 1514, in call_function
+ 2024-07-10 07:44:15 | ERROR | stderr | prediction = await anyio.to_thread.run_sync(
+ 2024-07-10 07:44:15 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/anyio/to_thread.py", line 56, in run_sync
+ 2024-07-10 07:44:15 | ERROR | stderr | return await get_async_backend().run_sync_in_worker_thread(
+ 2024-07-10 07:44:15 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 2177, in run_sync_in_worker_thread
+ 2024-07-10 07:44:15 | ERROR | stderr | return await future
+ 2024-07-10 07:44:15 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 859, in run
+ 2024-07-10 07:44:15 | ERROR | stderr | result = context.run(func, *args)
+ 2024-07-10 07:44:15 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/utils.py", line 833, in wrapper
+ 2024-07-10 07:44:15 | ERROR | stderr | response = f(*args, **kwargs)
+ 2024-07-10 07:44:15 | ERROR | stderr | File "/home/user/app/src/serve/gradio_block_arena_vision_named.py", line 225, in add_text
+ 2024-07-10 07:44:15 | ERROR | stderr | post_processed_text = _prepare_text_with_image(
+ 2024-07-10 07:44:15 | ERROR | stderr | File "/home/user/app/src/serve/gradio_web_server.py", line 317, in _prepare_text_with_image
+ 2024-07-10 07:44:15 | ERROR | stderr | image = state.conv.convert_image_to_base64(
+ 2024-07-10 07:44:15 | ERROR | stderr | AttributeError: 'Conversation' object has no attribute 'convert_image_to_base64'
+ 2024-07-10 07:44:16 | ERROR | stderr | Traceback (most recent call last):
+ 2024-07-10 07:44:16 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/queueing.py", line 541, in process_events
+ 2024-07-10 07:44:16 | ERROR | stderr | response = await route_utils.call_process_api(
+ 2024-07-10 07:44:16 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/route_utils.py", line 276, in call_process_api
+ 2024-07-10 07:44:16 | ERROR | stderr | output = await app.get_blocks().process_api(
+ 2024-07-10 07:44:16 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/blocks.py", line 1928, in process_api
+ 2024-07-10 07:44:16 | ERROR | stderr | result = await self.call_function(
+ 2024-07-10 07:44:16 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/blocks.py", line 1526, in call_function
+ 2024-07-10 07:44:16 | ERROR | stderr | prediction = await utils.async_iteration(iterator)
+ 2024-07-10 07:44:16 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/utils.py", line 657, in async_iteration
+ 2024-07-10 07:44:16 | ERROR | stderr | return await iterator.__anext__()
+ 2024-07-10 07:44:16 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/utils.py", line 650, in __anext__
+ 2024-07-10 07:44:16 | ERROR | stderr | return await anyio.to_thread.run_sync(
+ 2024-07-10 07:44:16 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/anyio/to_thread.py", line 56, in run_sync
+ 2024-07-10 07:44:16 | ERROR | stderr | return await get_async_backend().run_sync_in_worker_thread(
+ 2024-07-10 07:44:16 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 2177, in run_sync_in_worker_thread
+ 2024-07-10 07:44:16 | ERROR | stderr | return await future
+ 2024-07-10 07:44:16 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 859, in run
+ 2024-07-10 07:44:16 | ERROR | stderr | result = context.run(func, *args)
+ 2024-07-10 07:44:16 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/utils.py", line 633, in run_sync_iterator_async
+ 2024-07-10 07:44:16 | ERROR | stderr | return next(iterator)
+ 2024-07-10 07:44:16 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/utils.py", line 816, in gen_wrapper
+ 2024-07-10 07:44:16 | ERROR | stderr | response = next(iterator)
+ 2024-07-10 07:44:16 | ERROR | stderr | File "/home/user/app/src/serve/gradio_block_arena_named.py", line 237, in bot_response_multi
+ 2024-07-10 07:44:16 | ERROR | stderr | if state0.skip_next:
+ 2024-07-10 07:44:16 | ERROR | stderr | AttributeError: 'NoneType' object has no attribute 'skip_next'
+ 2024-07-10 07:48:59 | INFO | stdout | Running on local URL: http://0.0.0.0:7860
+ 2024-07-10 07:48:59 | INFO | stdout |
+ 2024-07-10 07:48:59 | INFO | stdout | To create a public link, set `share=True` in `launch()`.
+ 2024-07-10 07:49:17 | INFO | stdout | moderating image: /tmp/gradio/405613dcd3661394aad4b9b9addbd1743365fabf/screenshot-20240708-164613.png
+ 2024-07-10 07:49:17 | INFO | stdout | skip for now
+ 2024-07-10 07:49:17 | ERROR | stderr | Traceback (most recent call last):
+ 2024-07-10 07:49:17 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/queueing.py", line 541, in process_events
+ 2024-07-10 07:49:17 | ERROR | stderr | response = await route_utils.call_process_api(
+ 2024-07-10 07:49:17 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/route_utils.py", line 276, in call_process_api
+ 2024-07-10 07:49:17 | ERROR | stderr | output = await app.get_blocks().process_api(
+ 2024-07-10 07:49:17 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/blocks.py", line 1928, in process_api
+ 2024-07-10 07:49:17 | ERROR | stderr | result = await self.call_function(
+ 2024-07-10 07:49:17 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/blocks.py", line 1514, in call_function
+ 2024-07-10 07:49:17 | ERROR | stderr | prediction = await anyio.to_thread.run_sync(
+ 2024-07-10 07:49:17 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/anyio/to_thread.py", line 56, in run_sync
+ 2024-07-10 07:49:17 | ERROR | stderr | return await get_async_backend().run_sync_in_worker_thread(
+ 2024-07-10 07:49:17 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 2177, in run_sync_in_worker_thread
+ 2024-07-10 07:49:17 | ERROR | stderr | return await future
+ 2024-07-10 07:49:17 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 859, in run
+ 2024-07-10 07:49:17 | ERROR | stderr | result = context.run(func, *args)
+ 2024-07-10 07:49:17 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/utils.py", line 833, in wrapper
+ 2024-07-10 07:49:17 | ERROR | stderr | response = f(*args, **kwargs)
+ 2024-07-10 07:49:17 | ERROR | stderr | File "/home/user/app/src/serve/gradio_block_arena_vision_named.py", line 225, in add_text
+ 2024-07-10 07:49:17 | ERROR | stderr | post_processed_text = _prepare_text_with_image(
+ 2024-07-10 07:49:17 | ERROR | stderr | File "/home/user/app/src/serve/gradio_web_server.py", line 323, in _prepare_text_with_image
+ 2024-07-10 07:49:17 | ERROR | stderr | image = convert_image_to_base64(image, None)
+ 2024-07-10 07:49:17 | ERROR | stderr | File "/home/user/app/src/conversation.py", line 693, in convert_image_to_base64
+ 2024-07-10 07:49:17 | ERROR | stderr | from fastchat.utils import resize_image_and_return_image_in_bytes
+ 2024-07-10 07:49:17 | ERROR | stderr | ModuleNotFoundError: No module named 'fastchat'
+ 2024-07-10 07:49:19 | ERROR | stderr | Traceback (most recent call last):
+ 2024-07-10 07:49:19 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/queueing.py", line 541, in process_events
+ 2024-07-10 07:49:19 | ERROR | stderr | response = await route_utils.call_process_api(
+ 2024-07-10 07:49:19 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/route_utils.py", line 276, in call_process_api
+ 2024-07-10 07:49:19 | ERROR | stderr | output = await app.get_blocks().process_api(
+ 2024-07-10 07:49:19 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/blocks.py", line 1928, in process_api
+ 2024-07-10 07:49:19 | ERROR | stderr | result = await self.call_function(
+ 2024-07-10 07:49:19 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/blocks.py", line 1526, in call_function
+ 2024-07-10 07:49:19 | ERROR | stderr | prediction = await utils.async_iteration(iterator)
+ 2024-07-10 07:49:19 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/utils.py", line 657, in async_iteration
+ 2024-07-10 07:49:19 | ERROR | stderr | return await iterator.__anext__()
+ 2024-07-10 07:49:19 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/utils.py", line 650, in __anext__
+ 2024-07-10 07:49:19 | ERROR | stderr | return await anyio.to_thread.run_sync(
+ 2024-07-10 07:49:19 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/anyio/to_thread.py", line 56, in run_sync
+ 2024-07-10 07:49:19 | ERROR | stderr | return await get_async_backend().run_sync_in_worker_thread(
+ 2024-07-10 07:49:19 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 2177, in run_sync_in_worker_thread
+ 2024-07-10 07:49:19 | ERROR | stderr | return await future
+ 2024-07-10 07:49:19 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 859, in run
+ 2024-07-10 07:49:19 | ERROR | stderr | result = context.run(func, *args)
+ 2024-07-10 07:49:19 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/utils.py", line 633, in run_sync_iterator_async
+ 2024-07-10 07:49:19 | ERROR | stderr | return next(iterator)
+ 2024-07-10 07:49:19 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/utils.py", line 816, in gen_wrapper
+ 2024-07-10 07:49:19 | ERROR | stderr | response = next(iterator)
+ 2024-07-10 07:49:19 | ERROR | stderr | File "/home/user/app/src/serve/gradio_block_arena_named.py", line 237, in bot_response_multi
+ 2024-07-10 07:49:19 | ERROR | stderr | if state0.skip_next:
+ 2024-07-10 07:49:19 | ERROR | stderr | AttributeError: 'NoneType' object has no attribute 'skip_next'
+ 2024-07-10 07:49:56 | INFO | stdout | Running on local URL: http://0.0.0.0:7860
+ 2024-07-10 07:49:56 | INFO | stdout |
+ 2024-07-10 07:49:56 | INFO | stdout | To create a public link, set `share=True` in `launch()`.
+ 2024-07-10 07:50:30 | INFO | stdout | Running on local URL: http://0.0.0.0:7860
+ 2024-07-10 07:50:30 | INFO | stdout |
+ 2024-07-10 07:50:30 | INFO | stdout | To create a public link, set `share=True` in `launch()`.
+ 2024-07-10 07:50:41 | INFO | stdout | moderating image: /tmp/gradio/405613dcd3661394aad4b9b9addbd1743365fabf/screenshot-20240708-164613.png
+ 2024-07-10 07:50:41 | INFO | stdout | skip for now
+ 2024-07-10 07:50:41 | ERROR | stderr | Traceback (most recent call last):
+ 2024-07-10 07:50:41 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/queueing.py", line 541, in process_events
+ 2024-07-10 07:50:41 | ERROR | stderr | response = await route_utils.call_process_api(
+ 2024-07-10 07:50:41 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/route_utils.py", line 276, in call_process_api
+ 2024-07-10 07:50:41 | ERROR | stderr | output = await app.get_blocks().process_api(
+ 2024-07-10 07:50:41 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/blocks.py", line 1928, in process_api
+ 2024-07-10 07:50:41 | ERROR | stderr | result = await self.call_function(
+ 2024-07-10 07:50:41 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/blocks.py", line 1514, in call_function
+ 2024-07-10 07:50:41 | ERROR | stderr | prediction = await anyio.to_thread.run_sync(
+ 2024-07-10 07:50:41 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/anyio/to_thread.py", line 56, in run_sync
+ 2024-07-10 07:50:41 | ERROR | stderr | return await get_async_backend().run_sync_in_worker_thread(
+ 2024-07-10 07:50:41 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 2177, in run_sync_in_worker_thread
+ 2024-07-10 07:50:41 | ERROR | stderr | return await future
+ 2024-07-10 07:50:41 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 859, in run
+ 2024-07-10 07:50:41 | ERROR | stderr | result = context.run(func, *args)
+ 2024-07-10 07:50:41 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/utils.py", line 833, in wrapper
+ 2024-07-10 07:50:41 | ERROR | stderr | response = f(*args, **kwargs)
+ 2024-07-10 07:50:41 | ERROR | stderr | File "/home/user/app/src/serve/gradio_block_arena_vision_named.py", line 234, in add_text
+ 2024-07-10 07:50:41 | ERROR | stderr | + [x.to_gradio_chatbot() for x in states]
+ 2024-07-10 07:50:41 | ERROR | stderr | File "/home/user/app/src/serve/gradio_block_arena_vision_named.py", line 234, in <listcomp>
+ 2024-07-10 07:50:41 | ERROR | stderr | + [x.to_gradio_chatbot() for x in states]
+ 2024-07-10 07:50:41 | ERROR | stderr | File "/home/user/app/src/serve/gradio_web_server.py", line 130, in to_gradio_chatbot
+ 2024-07-10 07:50:41 | ERROR | stderr | return self.conv.to_gradio_chatbot()
+ 2024-07-10 07:50:41 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/llava/conversation.py", line 206, in to_gradio_chatbot
+ 2024-07-10 07:50:41 | ERROR | stderr | msg, image, image_process_mode = msg
+ 2024-07-10 07:50:41 | ERROR | stderr | ValueError: not enough values to unpack (expected 3, got 2)
+ 2024-07-10 07:50:43 | ERROR | stderr | Traceback (most recent call last):
+ 2024-07-10 07:50:43 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/queueing.py", line 541, in process_events
+ 2024-07-10 07:50:43 | ERROR | stderr | response = await route_utils.call_process_api(
+ 2024-07-10 07:50:43 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/route_utils.py", line 276, in call_process_api
+ 2024-07-10 07:50:43 | ERROR | stderr | output = await app.get_blocks().process_api(
+ 2024-07-10 07:50:43 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/blocks.py", line 1928, in process_api
+ 2024-07-10 07:50:43 | ERROR | stderr | result = await self.call_function(
+ 2024-07-10 07:50:43 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/blocks.py", line 1526, in call_function
+ 2024-07-10 07:50:43 | ERROR | stderr | prediction = await utils.async_iteration(iterator)
+ 2024-07-10 07:50:43 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/utils.py", line 657, in async_iteration
+ 2024-07-10 07:50:43 | ERROR | stderr | return await iterator.__anext__()
+ 2024-07-10 07:50:43 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/utils.py", line 650, in __anext__
+ 2024-07-10 07:50:43 | ERROR | stderr | return await anyio.to_thread.run_sync(
+ 2024-07-10 07:50:43 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/anyio/to_thread.py", line 56, in run_sync
+ 2024-07-10 07:50:43 | ERROR | stderr | return await get_async_backend().run_sync_in_worker_thread(
+ 2024-07-10 07:50:43 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 2177, in run_sync_in_worker_thread
+ 2024-07-10 07:50:43 | ERROR | stderr | return await future
+ 2024-07-10 07:50:43 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 859, in run
+ 2024-07-10 07:50:43 | ERROR | stderr | result = context.run(func, *args)
+ 2024-07-10 07:50:43 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/utils.py", line 633, in run_sync_iterator_async
+ 2024-07-10 07:50:43 | ERROR | stderr | return next(iterator)
+ 2024-07-10 07:50:43 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/utils.py", line 816, in gen_wrapper
+ 2024-07-10 07:50:43 | ERROR | stderr | response = next(iterator)
+ 2024-07-10 07:50:43 | ERROR | stderr | File "/home/user/app/src/serve/gradio_block_arena_named.py", line 237, in bot_response_multi
+ 2024-07-10 07:50:43 | ERROR | stderr | if state0.skip_next:
+ 2024-07-10 07:50:43 | ERROR | stderr | AttributeError: 'NoneType' object has no attribute 'skip_next'
+ 2024-07-10 07:55:50 | INFO | stdout | Running on local URL: http://0.0.0.0:7860
+ 2024-07-10 07:55:50 | INFO | stdout |
+ 2024-07-10 07:55:50 | INFO | stdout | To create a public link, set `share=True` in `launch()`.
+ 2024-07-10 07:56:32 | INFO | stdout | moderating image: /tmp/gradio/405613dcd3661394aad4b9b9addbd1743365fabf/screenshot-20240708-164613.png
+ 2024-07-10 07:56:33 | INFO | stdout | skip for now
+ 2024-07-10 07:56:33 | ERROR | stderr | Traceback (most recent call last):
+ 2024-07-10 07:56:33 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/queueing.py", line 541, in process_events
+ 2024-07-10 07:56:33 | ERROR | stderr | response = await route_utils.call_process_api(
+ 2024-07-10 07:56:33 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/route_utils.py", line 276, in call_process_api
+ 2024-07-10 07:56:33 | ERROR | stderr | output = await app.get_blocks().process_api(
+ 2024-07-10 07:56:33 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/blocks.py", line 1928, in process_api
+ 2024-07-10 07:56:33 | ERROR | stderr | result = await self.call_function(
+ 2024-07-10 07:56:33 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/blocks.py", line 1514, in call_function
+ 2024-07-10 07:56:33 | ERROR | stderr | prediction = await anyio.to_thread.run_sync(
+ 2024-07-10 07:56:33 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/anyio/to_thread.py", line 56, in run_sync
+ 2024-07-10 07:56:33 | ERROR | stderr | return await get_async_backend().run_sync_in_worker_thread(
+ 2024-07-10 07:56:33 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 2177, in run_sync_in_worker_thread
+ 2024-07-10 07:56:33 | ERROR | stderr | return await future
+ 2024-07-10 07:56:33 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 859, in run
+ 2024-07-10 07:56:33 | ERROR | stderr | result = context.run(func, *args)
+ 2024-07-10 07:56:33 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/utils.py", line 833, in wrapper
+ 2024-07-10 07:56:33 | ERROR | stderr | response = f(*args, **kwargs)
+ 2024-07-10 07:56:33 | ERROR | stderr | File "/home/user/app/src/serve/gradio_block_arena_vision_named.py", line 235, in add_text
+ 2024-07-10 07:56:33 | ERROR | stderr | + [x.to_gradio_chatbot() for x in states]
+ 2024-07-10 07:56:33 | ERROR | stderr | File "/home/user/app/src/serve/gradio_block_arena_vision_named.py", line 235, in <listcomp>
+ 2024-07-10 07:56:33 | ERROR | stderr | + [x.to_gradio_chatbot() for x in states]
+ 2024-07-10 07:56:33 | ERROR | stderr | File "/home/user/app/src/serve/gradio_web_server.py", line 130, in to_gradio_chatbot
+ 2024-07-10 07:56:33 | ERROR | stderr | return self.conv.to_gradio_chatbot()
+ 2024-07-10 07:56:33 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/llava/conversation.py", line 206, in to_gradio_chatbot
+ 2024-07-10 07:56:33 | ERROR | stderr | msg, image, image_process_mode = msg
+ 2024-07-10 07:56:33 | ERROR | stderr | ValueError: not enough values to unpack (expected 3, got 2)
+ 2024-07-10 07:56:34 | ERROR | stderr | Traceback (most recent call last):
+ 2024-07-10 07:56:34 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/queueing.py", line 541, in process_events
+ 2024-07-10 07:56:34 | ERROR | stderr | response = await route_utils.call_process_api(
+ 2024-07-10 07:56:34 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/route_utils.py", line 276, in call_process_api
+ 2024-07-10 07:56:34 | ERROR | stderr | output = await app.get_blocks().process_api(
+ 2024-07-10 07:56:34 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/blocks.py", line 1928, in process_api
+ 2024-07-10 07:56:34 | ERROR | stderr | result = await self.call_function(
+ 2024-07-10 07:56:34 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/blocks.py", line 1526, in call_function
+ 2024-07-10 07:56:34 | ERROR | stderr | prediction = await utils.async_iteration(iterator)
+ 2024-07-10 07:56:34 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/utils.py", line 657, in async_iteration
+ 2024-07-10 07:56:34 | ERROR | stderr | return await iterator.__anext__()
+ 2024-07-10 07:56:34 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/utils.py", line 650, in __anext__
+ 2024-07-10 07:56:34 | ERROR | stderr | return await anyio.to_thread.run_sync(
+ 2024-07-10 07:56:34 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/anyio/to_thread.py", line 56, in run_sync
+ 2024-07-10 07:56:34 | ERROR | stderr | return await get_async_backend().run_sync_in_worker_thread(
+ 2024-07-10 07:56:34 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 2177, in run_sync_in_worker_thread
+ 2024-07-10 07:56:34 | ERROR | stderr | return await future
+ 2024-07-10 07:56:34 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 859, in run
+ 2024-07-10 07:56:34 | ERROR | stderr | result = context.run(func, *args)
+ 2024-07-10 07:56:34 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/utils.py", line 633, in run_sync_iterator_async
+ 2024-07-10 07:56:34 | ERROR | stderr | return next(iterator)
+ 2024-07-10 07:56:34 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/utils.py", line 816, in gen_wrapper
+ 2024-07-10 07:56:34 | ERROR | stderr | response = next(iterator)
+ 2024-07-10 07:56:34 | ERROR | stderr | File "/home/user/app/src/serve/gradio_block_arena_named.py", line 237, in bot_response_multi
+ 2024-07-10 07:56:34 | ERROR | stderr | if state0.skip_next:
+ 2024-07-10 07:56:34 | ERROR | stderr | AttributeError: 'NoneType' object has no attribute 'skip_next'
+ 2024-07-10 07:58:59 | ERROR | stderr | ERROR: Exception in ASGI application
+ 2024-07-10 07:58:59 | ERROR | stderr | Traceback (most recent call last):
+ 2024-07-10 07:58:59 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/routes.py", line 404, in main
+ 2024-07-10 07:58:59 | ERROR | stderr | @app.get("/assets/{path:path}")
+ 2024-07-10 07:58:59 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/starlette/templating.py", line 229, in TemplateResponse
+ 2024-07-10 07:58:59 | ERROR | stderr | template = self.get_template(name)
+ 2024-07-10 07:58:59 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/starlette/templating.py", line 143, in get_template
+ 2024-07-10 07:58:59 | ERROR | stderr | return self.env.get_template(name)
+ 2024-07-10 07:58:59 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/jinja2/environment.py", line 1013, in get_template
+ 2024-07-10 07:58:59 | ERROR | stderr | return self._load_template(name, globals)
+ 2024-07-10 07:58:59 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/jinja2/environment.py", line 972, in _load_template
+ 2024-07-10 07:58:59 | ERROR | stderr | template = self.loader.load(self, name, self.make_globals(globals))
+ 2024-07-10 07:58:59 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/jinja2/loaders.py", line 126, in load
+ 2024-07-10 07:58:59 | ERROR | stderr | source, filename, uptodate = self.get_source(environment, name)
+ 2024-07-10 07:58:59 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/jinja2/loaders.py", line 207, in get_source
+ 2024-07-10 07:58:59 | ERROR | stderr | raise TemplateNotFound(template)
+ 2024-07-10 07:58:59 | ERROR | stderr | jinja2.exceptions.TemplateNotFound: frontend/index.html
+ 2024-07-10 07:58:59 | ERROR | stderr |
+ 2024-07-10 07:58:59 | ERROR | stderr | The above exception was the direct cause of the following exception:
+ 2024-07-10 07:58:59 | ERROR | stderr |
+ 2024-07-10 07:58:59 | ERROR | stderr | Traceback (most recent call last):
+ 2024-07-10 07:58:59 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/uvicorn/protocols/http/httptools_impl.py", line 399, in run_asgi
+ 2024-07-10 07:58:59 | ERROR | stderr | result = await app( # type: ignore[func-returns-value]
+ 2024-07-10 07:58:59 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/uvicorn/middleware/proxy_headers.py", line 70, in __call__
+ 2024-07-10 07:58:59 | ERROR | stderr | return await self.app(scope, receive, send)
+ 2024-07-10 07:58:59 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/fastapi/applications.py", line 1054, in __call__
+ 2024-07-10 07:58:59 | ERROR | stderr | await super().__call__(scope, receive, send)
+ 2024-07-10 07:58:59 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/starlette/applications.py", line 123, in __call__
+ 2024-07-10 07:58:59 | ERROR | stderr | await self.middleware_stack(scope, receive, send)
+ 2024-07-10 07:58:59 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/starlette/middleware/errors.py", line 186, in __call__
+ 2024-07-10 07:58:59 | ERROR | stderr | raise exc
+ 2024-07-10 07:58:59 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/starlette/middleware/errors.py", line 164, in __call__
+ 2024-07-10 07:58:59 | ERROR | stderr | await self.app(scope, receive, _send)
+ 2024-07-10 07:58:59 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/route_utils.py", line 714, in __call__
+ 2024-07-10 07:58:59 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/starlette/middleware/exceptions.py", line 65, in __call__
+ 2024-07-10 07:58:59 | ERROR | stderr | await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
+ 2024-07-10 07:58:59 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
+ 2024-07-10 07:58:59 | ERROR | stderr | raise exc
+ 2024-07-10 07:58:59 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
+ 2024-07-10 07:58:59 | ERROR | stderr | await app(scope, receive, sender)
+ 2024-07-10 07:58:59 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/starlette/routing.py", line 756, in __call__
+ 2024-07-10 07:58:59 | ERROR | stderr | await self.middleware_stack(scope, receive, send)
+ 2024-07-10 07:58:59 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/starlette/routing.py", line 776, in app
+ 2024-07-10 07:58:59 | ERROR | stderr | await route.handle(scope, receive, send)
+ 2024-07-10 07:58:59 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/starlette/routing.py", line 297, in handle
+ 2024-07-10 07:58:59 | ERROR | stderr | await self.app(scope, receive, send)
+ 2024-07-10 07:58:59 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/starlette/routing.py", line 77, in app
+ 2024-07-10 07:58:59 | ERROR | stderr | await wrap_app_handling_exceptions(app, request)(scope, receive, send)
+ 2024-07-10 07:58:59 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
+ 2024-07-10 07:58:59 | ERROR | stderr | raise exc
+ 2024-07-10 07:58:59 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
+ 2024-07-10 07:58:59 | ERROR | stderr | await app(scope, receive, sender)
+ 2024-07-10 07:58:59 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/starlette/routing.py", line 72, in app
+ 2024-07-10 07:58:59 | ERROR | stderr | response = await func(request)
+ 2024-07-10 07:58:59 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/fastapi/routing.py", line 278, in app
+ 2024-07-10 07:58:59 | ERROR | stderr | raw_response = await run_endpoint_function(
+ 2024-07-10 07:58:59 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/fastapi/routing.py", line 193, in run_endpoint_function
+ 2024-07-10 07:58:59 | ERROR | stderr | return await run_in_threadpool(dependant.call, **values)
+ 2024-07-10 07:58:59 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/starlette/concurrency.py", line 42, in run_in_threadpool
+ 2024-07-10 07:58:59 | ERROR | stderr | return await anyio.to_thread.run_sync(func, *args)
+ 2024-07-10 07:58:59 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/anyio/to_thread.py", line 56, in run_sync
+ 2024-07-10 07:58:59 | ERROR | stderr | return await get_async_backend().run_sync_in_worker_thread(
+ 2024-07-10 07:58:59 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 2177, in run_sync_in_worker_thread
+ 2024-07-10 07:58:59 | ERROR | stderr | return await future
+ 2024-07-10 07:58:59 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 859, in run
+ 2024-07-10 07:58:59 | ERROR | stderr | result = context.run(func, *args)
+ 2024-07-10 07:58:59 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/routes.py", line 419, in main
+ 2024-07-10 07:58:59 | ERROR | stderr | async def reverse_proxy(url_path: str):
+ 2024-07-10 07:58:59 | ERROR | stderr | ValueError: Did you install Gradio from source files? You need to build the frontend by running /scripts/build_frontend.sh
+ 2024-07-10 07:58:59 | ERROR | stderr | ERROR: Exception in ASGI application
+ 2024-07-10 07:58:59 | ERROR | stderr | Traceback (most recent call last):
+ 2024-07-10 07:58:59 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/routes.py", line 404, in main
+ 2024-07-10 07:58:59 | ERROR | stderr | @app.get("/assets/{path:path}")
+ 2024-07-10 07:58:59 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/starlette/templating.py", line 229, in TemplateResponse
+ 2024-07-10 07:58:59 | ERROR | stderr | template = self.get_template(name)
+ 2024-07-10 07:58:59 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/starlette/templating.py", line 143, in get_template
+ 2024-07-10 07:58:59 | ERROR | stderr | return self.env.get_template(name)
+ 2024-07-10 07:58:59 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/jinja2/environment.py", line 1013, in get_template
+ 2024-07-10 07:58:59 | ERROR | stderr | return self._load_template(name, globals)
+ 2024-07-10 07:58:59 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/jinja2/environment.py", line 972, in _load_template
+ 2024-07-10 07:58:59 | ERROR | stderr | template = self.loader.load(self, name, self.make_globals(globals))
+ 2024-07-10 07:58:59 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/jinja2/loaders.py", line 126, in load
+ 2024-07-10 07:58:59 | ERROR | stderr | source, filename, uptodate = self.get_source(environment, name)
+ 2024-07-10 07:58:59 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/jinja2/loaders.py", line 207, in get_source
+ 2024-07-10 07:58:59 | ERROR | stderr | raise TemplateNotFound(template)
+ 2024-07-10 07:58:59 | ERROR | stderr | jinja2.exceptions.TemplateNotFound: frontend/index.html
+ 2024-07-10 07:58:59 | ERROR | stderr |
+ 2024-07-10 07:58:59 | ERROR | stderr | The above exception was the direct cause of the following exception:
+ 2024-07-10 07:58:59 | ERROR | stderr |
+ 2024-07-10 07:58:59 | ERROR | stderr | Traceback (most recent call last):
+ 2024-07-10 07:58:59 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/uvicorn/protocols/http/httptools_impl.py", line 399, in run_asgi
+ 2024-07-10 07:58:59 | ERROR | stderr | result = await app( # type: ignore[func-returns-value]
+ 2024-07-10 07:58:59 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/uvicorn/middleware/proxy_headers.py", line 70, in __call__
+ 2024-07-10 07:58:59 | ERROR | stderr | return await self.app(scope, receive, send)
+ 2024-07-10 07:58:59 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/fastapi/applications.py", line 1054, in __call__
+ 2024-07-10 07:58:59 | ERROR | stderr | await super().__call__(scope, receive, send)
+ 2024-07-10 07:58:59 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/starlette/applications.py", line 123, in __call__
+ 2024-07-10 07:58:59 | ERROR | stderr | await self.middleware_stack(scope, receive, send)
+ 2024-07-10 07:58:59 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/starlette/middleware/errors.py", line 186, in __call__
956
+ 2024-07-10 07:58:59 | ERROR | stderr | raise exc
957
+ 2024-07-10 07:58:59 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/starlette/middleware/errors.py", line 164, in __call__
958
+ 2024-07-10 07:58:59 | ERROR | stderr | await self.app(scope, receive, _send)
959
+ 2024-07-10 07:58:59 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/route_utils.py", line 714, in __call__
960
+ 2024-07-10 07:58:59 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/starlette/middleware/exceptions.py", line 65, in __call__
961
+ 2024-07-10 07:58:59 | ERROR | stderr | await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
962
+ 2024-07-10 07:58:59 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
963
+ 2024-07-10 07:58:59 | ERROR | stderr | raise exc
964
+ 2024-07-10 07:58:59 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
965
+ 2024-07-10 07:58:59 | ERROR | stderr | await app(scope, receive, sender)
966
+ 2024-07-10 07:58:59 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/starlette/routing.py", line 756, in __call__
967
+ 2024-07-10 07:58:59 | ERROR | stderr | await self.middleware_stack(scope, receive, send)
968
+ 2024-07-10 07:58:59 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/starlette/routing.py", line 776, in app
969
+ 2024-07-10 07:58:59 | ERROR | stderr | await route.handle(scope, receive, send)
970
+ 2024-07-10 07:58:59 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/starlette/routing.py", line 297, in handle
971
+ 2024-07-10 07:58:59 | ERROR | stderr | await self.app(scope, receive, send)
972
+ 2024-07-10 07:58:59 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/starlette/routing.py", line 77, in app
973
+ 2024-07-10 07:58:59 | ERROR | stderr | await wrap_app_handling_exceptions(app, request)(scope, receive, send)
974
+ 2024-07-10 07:58:59 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
975
+ 2024-07-10 07:58:59 | ERROR | stderr | raise exc
976
+ 2024-07-10 07:58:59 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
977
+ 2024-07-10 07:58:59 | ERROR | stderr | await app(scope, receive, sender)
978
+ 2024-07-10 07:58:59 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/starlette/routing.py", line 72, in app
979
+ 2024-07-10 07:58:59 | ERROR | stderr | response = await func(request)
980
+ 2024-07-10 07:58:59 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/fastapi/routing.py", line 278, in app
981
+ 2024-07-10 07:58:59 | ERROR | stderr | raw_response = await run_endpoint_function(
982
+ 2024-07-10 07:58:59 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/fastapi/routing.py", line 193, in run_endpoint_function
983
+ 2024-07-10 07:58:59 | ERROR | stderr | return await run_in_threadpool(dependant.call, **values)
984
+ 2024-07-10 07:58:59 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/starlette/concurrency.py", line 42, in run_in_threadpool
985
+ 2024-07-10 07:58:59 | ERROR | stderr | return await anyio.to_thread.run_sync(func, *args)
986
+ 2024-07-10 07:58:59 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/anyio/to_thread.py", line 56, in run_sync
987
+ 2024-07-10 07:58:59 | ERROR | stderr | return await get_async_backend().run_sync_in_worker_thread(
988
+ 2024-07-10 07:58:59 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 2177, in run_sync_in_worker_thread
989
+ 2024-07-10 07:58:59 | ERROR | stderr | return await future
990
+ 2024-07-10 07:58:59 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 859, in run
991
+ 2024-07-10 07:58:59 | ERROR | stderr | result = context.run(func, *args)
992
+ 2024-07-10 07:58:59 | ERROR | stderr | File "/usr/local/lib/python3.10/site-packages/gradio/routes.py", line 419, in main
993
+ 2024-07-10 07:58:59 | ERROR | stderr | async def reverse_proxy(url_path: str):
994
+ 2024-07-10 07:58:59 | ERROR | stderr | ValueError: Did you install Gradio from source files? You need to build the frontend by running /scripts/build_frontend.sh
995
+ 2024-07-10 07:59:35 | ERROR | stderr | Traceback (most recent call last):
996
+ 2024-07-10 07:59:35 | ERROR | stderr | File "/home/user/app/app.py", line 18, in <module>
997
+ 2024-07-10 07:59:35 | ERROR | stderr | main()
998
+ 2024-07-10 07:59:35 | ERROR | stderr | File "/home/user/app/app.py", line 11, in main
999
+ 2024-07-10 07:59:35 | ERROR | stderr | states = build_side_by_side_vision_ui_named(
1000
+ 2024-07-10 07:59:35 | ERROR | stderr | File "/home/user/app/src/serve/gradio_block_arena_vision_named.py", line 320, in build_side_by_side_vision_ui_named
1001
+ 2024-07-10 07:59:35 | ERROR | stderr | textbox = gr.MultimodalTextbox(
1002
+ 2024-07-10 07:59:35 | ERROR | stderr | AttributeError: module 'gradio' has no attribute 'MultimodalTextbox'
1003
+ 2024-07-10 07:59:36 | INFO | stdout | IMPORTANT: You are using gradio version 4.16.0, however version 4.29.0 is available, please upgrade.
1004
+ 2024-07-10 07:59:36 | INFO | stdout | --------
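The `AttributeError` above comes from building the UI on Gradio 4.16.0, which predates `gr.MultimodalTextbox`. A minimal sketch of a version-tolerant fallback, assuming a plain `gr.Textbox` is an acceptable degraded input (the helper name and placeholder text are illustrative, not from this repo):

```python
import gradio as gr

def build_prompt_box(placeholder="Enter text and upload images"):
    # gr.MultimodalTextbox only exists in newer Gradio 4.x releases;
    # feature-detect instead of assuming the installed version.
    if hasattr(gr, "MultimodalTextbox"):
        return gr.MultimodalTextbox(
            file_types=["image"],
            placeholder=placeholder,
            show_label=False,
        )
    # Degraded fallback for older Gradio (e.g. 4.16.0): text-only input.
    return gr.Textbox(placeholder=placeholder, show_label=False)
```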
gradio_web_server_multi.log CHANGED
The diff for this file is too large to render. See raw diff
 
src/__pycache__/conversation.cpython-310.pyc CHANGED
Binary files a/src/__pycache__/conversation.cpython-310.pyc and b/src/__pycache__/conversation.cpython-310.pyc differ
 
src/__pycache__/utils.cpython-310.pyc CHANGED
Binary files a/src/__pycache__/utils.cpython-310.pyc and b/src/__pycache__/utils.cpython-310.pyc differ
 
src/conversation.py CHANGED
@@ -357,7 +357,7 @@ class Conversation:
         """Given an image, return the base64 encoded image string."""
         from PIL import Image
         import requests
-        from fastchat.utils import resize_image_and_return_image_in_bytes
+        from src.utils import resize_image_and_return_image_in_bytes
 
         # Load image if it has not been loaded in yet
         if type(image) == str:
@@ -480,7 +480,7 @@ class Conversation:
         return ret
 
     def to_gemini_api_messages(self):
-        from fastchat.utils import load_image
+        from src.utils import load_image
 
         if self.system_message == "":
             ret = []
@@ -686,6 +686,29 @@ class Conversation:
             "offset": self.offset,
         }
 
+def convert_image_to_base64(image, max_image_size_mb):
+    """Given an image, return the base64 encoded image string."""
+    from PIL import Image
+    import requests
+    from src.utils import resize_image_and_return_image_in_bytes
+
+    # Load image if it has not been loaded in yet
+    if type(image) == str:
+        if image.startswith("http://") or image.startswith("https://"):
+            response = requests.get(image)
+            image = Image.open(BytesIO(response.content)).convert("RGB")
+        elif "base64" in image:
+            # OpenAI format is: data:image/jpeg;base64,{base64_encoded_image_str}
+            return image.split(",")[1]
+        else:
+            image = Image.open(image).convert("RGB")
+
+    image_bytes = resize_image_and_return_image_in_bytes(
+        image, max_image_size_mb
+    )
+    img_b64_str = base64.b64encode(image_bytes.getvalue()).decode()
+
+    return img_b64_str
 
 # A global registry for all conversation templates
 conv_templates: Dict[str, Conversation] = {}
@@ -2062,7 +2085,7 @@ register_conv_template(
 
 
 if __name__ == "__main__":
-    from fastchat.conversation import get_conv_template
+    from src.conversation import get_conv_template
 
     print("-- Vicuna template --")
     conv = get_conv_template("vicuna_v1.1")
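Promoting `convert_image_to_base64` to a module-level function lets conversation objects that lack the method reuse the same encoding path. A minimal usage sketch, assuming a local `sample.png`, that `resize_image_and_return_image_in_bytes` accepts `None` to mean no size cap, and that `BytesIO` and `base64` are already imported at module scope in `conversation.py`:

```python
from PIL import Image
from src.conversation import convert_image_to_base64

# Encode a local image; None skips resizing (an assumption about src.utils).
image = Image.open("sample.png").convert("RGB")
img_b64 = convert_image_to_base64(image, None)
print(img_b64[:40], "...")
```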
src/model/__pycache__/model_adapter.cpython-310.pyc CHANGED
Binary files a/src/model/__pycache__/model_adapter.cpython-310.pyc and b/src/model/__pycache__/model_adapter.cpython-310.pyc differ
 
src/model/model_adapter.py CHANGED
@@ -2291,6 +2291,15 @@ class LlavaAdapter(BaseModelAdapter):
         return "llava" in model_path.lower()
 
     def get_default_conv_template(self, model_path: str) -> Conversation:
+        from loguru import logger
+        logger.info("model_path {}", model_path)
+        if model_path in ["llava-fire", "llava-original"]:
+            from llava.conversation import conv_templates
+            if model_path == "llava-fire":
+                return conv_templates["llama_v3_student"].copy()
+            else:
+                return conv_templates["llama_v3"].copy()
+
         model_path = model_path.lower()
         if "34b" in model_path:
             return get_conv_template("llava-chatml")
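The new branch short-circuits template lookup for the two demo checkpoints before the generic path-matching heuristics run. Roughly how it dispatches, as a sketch (assuming `LlavaAdapter` can be instantiated without arguments, as is typical for these adapters):

```python
from src.model.model_adapter import LlavaAdapter

adapter = LlavaAdapter()
# The demo checkpoints resolve to llava's own templates...
conv_fire = adapter.get_default_conv_template("llava-fire")      # llama_v3_student
conv_orig = adapter.get_default_conv_template("llava-original")  # llama_v3
# ...while other llava model paths keep the old behavior:
conv_34b = adapter.get_default_conv_template("llava-v1.6-34b")   # llava-chatml
```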
src/model/model_llava.py CHANGED
@@ -2,6 +2,7 @@ from llava.model.builder import load_pretrained_model
 from llava.mm_utils import get_model_name_from_path, process_images, tokenizer_image_token
 from llava.constants import IMAGE_TOKEN_INDEX, DEFAULT_IMAGE_TOKEN
 from llava.conversation import conv_templates
+from loguru import logger
 
 from PIL import Image
 import requests
@@ -57,5 +58,28 @@ def inference():
     print(text_outputs)
     return text_outputs
 
+
+@spaces.GPU
+def inference_by_prompt_and_image(prompt, images):
+    device = "cuda"
+    image_tensor = process_images(images, image_processor_llava, model_llava.config)
+    image_tensor = image_tensor.to(dtype=torch.float16, device=device)
+    input_ids = tokenizer_image_token(prompt, tokenizer_llava, IMAGE_TOKEN_INDEX, return_tensors="pt").unsqueeze(0).to(device)
+    image_sizes = [image.size for image in images]
+    logger.info("Shape: {};{}", input_ids.shape, image_tensor.shape)
+    with torch.inference_mode():
+        cont = model_llava.generate(
+            input_ids,
+            images=image_tensor,
+            image_sizes=image_sizes,
+            do_sample=False,
+            temperature=0,
+            max_new_tokens=256,
+            use_cache=True,
+        )
+    text_outputs = tokenizer_llava.batch_decode(cont, skip_special_tokens=True)
+    logger.info("response={}", text_outputs)
+    return text_outputs
+
 if __name__ == "__main__":
     inference()
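`inference_by_prompt_and_image` expects a fully templated prompt containing the image placeholder consumed by `tokenizer_image_token`, plus a list of PIL images; the `model_llava`, `tokenizer_llava`, and `image_processor_llava` globals must already be loaded when the module is imported. A hedged usage sketch (the hand-written prompt below stands in for output of the llama_v3 conversation template):

```python
from PIL import Image
from src.model.model_llava import inference_by_prompt_and_image

# "<image>" is DEFAULT_IMAGE_TOKEN; in the app the prompt comes from the
# conversation template rather than being written by hand like this.
prompt = "<image>\nDescribe this picture."
images = [Image.open("sample.png").convert("RGB")]

outputs = inference_by_prompt_and_image(prompt, images)
print(outputs[0])
```

Note that passing `temperature=0` alongside `do_sample=False` makes transformers emit sampling-flag warnings; in greedy mode the `temperature` argument can simply be dropped.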
src/serve/__pycache__/gradio_block_arena_vision_named.cpython-310.pyc CHANGED
Binary files a/src/serve/__pycache__/gradio_block_arena_vision_named.cpython-310.pyc and b/src/serve/__pycache__/gradio_block_arena_vision_named.cpython-310.pyc differ
 
src/serve/__pycache__/gradio_web_server.cpython-310.pyc CHANGED
Binary files a/src/serve/__pycache__/gradio_web_server.cpython-310.pyc and b/src/serve/__pycache__/gradio_web_server.cpython-310.pyc differ
 
src/serve/gradio_block_arena_vision_named.py CHANGED
@@ -225,6 +225,7 @@ def add_text(
         post_processed_text = _prepare_text_with_image(
             states[i], text, images, csam_flag=csam_flag
         )
+        logger.info(f"msg={post_processed_text}")
         states[i].conv.append_message(states[i].conv.roles[0], post_processed_text)
         states[i].conv.append_message(states[i].conv.roles[1], None)
         states[i].skip_next = False
src/serve/gradio_web_server.py CHANGED
@@ -115,7 +115,12 @@ class State:
         self.init_system_prompt(self.conv)
 
     def init_system_prompt(self, conv):
-        system_prompt = conv.get_system_message()
+        if hasattr(conv, "get_system_message"):
+            system_prompt = conv.get_system_message()
+        elif hasattr(conv, "system"):
+            system_prompt = conv.system
+        else:
+            return  # No need for system prompt
         if len(system_prompt) == 0:
             return
         current_date = datetime.datetime.now().strftime("%Y-%m-%d")
@@ -310,9 +315,13 @@ def _prepare_text_with_image(state, text, images, csam_flag):
         # reset convo with new image
         state.conv = get_conversation_template(state.model_name)
 
-        image = state.conv.convert_image_to_base64(
-            image
-        )  # PIL type is not JSON serializable
+        if hasattr(state.conv, "convert_image_to_base64"):
+            image = state.conv.convert_image_to_base64(
+                image
+            )  # PIL type is not JSON serializable
+        else:
+            from src.conversation import convert_image_to_base64
+            image = convert_image_to_base64(image, None)
 
         if csam_flag:
             state.has_csam_image = True
@@ -452,8 +461,9 @@ def bot_response(
     logger.info(f"model_name: {model_name};model_api_dict: {model_api_dict}")
     if model_api_dict is None:
         if model_name == "llava-original":
+            from src.model.model_llava import inference, inference_by_prompt_and_image
+            logger.info(f"prompt: {conv.get_prompt()}; images: {images}")
+            output_text = inference_by_prompt_and_image(conv.get_prompt(), images)[0]
         else:
             output_text = "hello"
         stream_iter = [{
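Both new branches in this file use the same duck-typing pattern: probe the conversation object for the capability and fall back to the module-level helper when it is absent (the llava package's templates do not define the method). The pattern in isolation, as a sketch with a hypothetical helper name:

```python
def image_to_b64(conv, image):
    # src.conversation templates expose the method; llava templates do not.
    if hasattr(conv, "convert_image_to_base64"):
        return conv.convert_image_to_base64(image)
    from src.conversation import convert_image_to_base64
    return convert_image_to_base64(image, None)  # None = no size cap
```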
src/utils.py CHANGED
@@ -474,6 +474,8 @@ def image_moderation_request(image_bytes, endpoint, api_key):
 
 
 def image_moderation_provider(image, api_type):
+    print("skip for now")
+    return False
     if api_type == "nsfw":
         endpoint = os.environ["AZURE_IMG_MODERATION_ENDPOINT"]
         api_key = os.environ["AZURE_IMG_MODERATION_API_KEY"]
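The hard-coded early `return False` disables image moderation entirely and leaves the Azure path below it unreachable. A less invasive variant, sketched under the assumption that an environment toggle is acceptable (the variable name is made up for illustration, not part of this repo):

```python
import os

def moderation_enabled() -> bool:
    # Hypothetical opt-in toggle: moderation stays off by default and can be
    # re-enabled without editing the function body.
    return os.environ.get("ENABLE_IMAGE_MODERATION", "0") == "1"

def image_moderation_provider(image, api_type):
    if not moderation_enabled():
        return False
    # ... original Azure moderation logic continues here ...
```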
vision-tmp-2024-07-10-conv.json CHANGED
@@ -3,3 +3,5 @@
 {"tstamp": 1720588858.8939, "type": "chat", "model": "llava-fire", "gen_params": {"temperature": 0.7, "top_p": 1.0, "max_new_tokens": 1024}, "start": 1720588858.8843, "finish": 1720588858.8939, "state": {"template_name": "vicuna_v1.1", "system_message": "A chat between a curious user and an artificial intelligence assistant. The assistant gives helpful, detailed, and polite answers to the user's questions.", "roles": ["USER", "ASSISTANT"], "messages": [["USER", "Hello"], ["ASSISTANT", "hello"]], "offset": 0, "conv_id": "d994e3d859c94bddbf0dfcaed6c63079", "model_name": "llava-fire", "has_csam_image": false}, "ip": "46.3.240.105"}
 {"tstamp": 1720588858.8951, "type": "chat", "model": "llava-original", "gen_params": {"temperature": 0.7, "top_p": 1.0, "max_new_tokens": 1024}, "start": 1720588858.8863, "finish": 1720588858.8951, "state": {"template_name": "vicuna_v1.1", "system_message": "A chat between a curious user and an artificial intelligence assistant. The assistant gives helpful, detailed, and polite answers to the user's questions.", "roles": ["USER", "ASSISTANT"], "messages": [["USER", "Hello"], ["ASSISTANT", "hello"]], "offset": 0, "conv_id": "e41b47f05f8b44ff9d520c8e94c6e8de", "model_name": "llava-original", "has_csam_image": false}, "ip": "46.3.240.105"}
 {"tstamp": 1720589062.1758, "type": "chat", "model": "llava-fire", "gen_params": {"temperature": 0.7, "top_p": 1.0, "max_new_tokens": 1024}, "start": 1720589047.9171, "finish": 1720589062.1758, "state": {"template_name": "vicuna_v1.1", "system_message": "A chat between a curious user and an artificial intelligence assistant. The assistant gives helpful, detailed, and polite answers to the user's questions.", "roles": ["USER", "ASSISTANT"], "messages": [["USER", "Hello"], ["ASSISTANT", "hello"]], "offset": 0, "conv_id": "963f15cd5e224eb8ae02c67ed37b93c4", "model_name": "llava-fire", "has_csam_image": false}, "ip": "46.3.240.105"}
+{"tstamp": 1720589240.391, "type": "chat", "model": "llava-fire", "gen_params": {"temperature": 0.7, "top_p": 1.0, "max_new_tokens": 1024}, "start": 1720589221.915, "finish": 1720589240.391, "state": {"template_name": "vicuna_v1.1", "system_message": "A chat between a curious user and an artificial intelligence assistant. The assistant gives helpful, detailed, and polite answers to the user's questions.", "roles": ["USER", "ASSISTANT"], "messages": [["USER", "Hello"], ["ASSISTANT", "hello"]], "offset": 0, "conv_id": "062c8dbcf02d4779ad5cb89f836a7480", "model_name": "llava-fire", "has_csam_image": false}, "ip": "46.3.240.106"}
+{"tstamp": 1720589240.3917, "type": "chat", "model": "llava-original", "gen_params": {"temperature": 0.7, "top_p": 1.0, "max_new_tokens": 1024}, "start": 1720589221.917, "finish": 1720589240.3917, "state": {"template_name": "vicuna_v1.1", "system_message": "A chat between a curious user and an artificial intelligence assistant. The assistant gives helpful, detailed, and polite answers to the user's questions.", "roles": ["USER", "ASSISTANT"], "messages": [["USER", "Hello"], ["ASSISTANT", "The figure in the image is a can of S\u00f6dra Almighy, which is a beer. S\u00f6dra is a Swedish brewery known for its lager beers. The can's design is modern and minimalist, with a color scheme that includes black and white, and the text is in English. The label indicates that it is a dry-hopped beer, which means it has been flavored with hops that have not been steeped in the brewing process, giving it a unique taste profile. The can's design suggests a contemporary and possibly craft beer, which is often associated with a more complex flavor profile than traditional lagers."]], "offset": 0, "conv_id": "3255dc27032345ba845813c17981f372", "model_name": "llava-original", "has_csam_image": false}, "ip": "46.3.240.106"}