how to deal Paper Count=0 | Relevant Papers=0 | Current Evidence=0 #782

Open
chenzf11 opened this issue Dec 31, 2024 · 3 comments
Labels
bug Something isn't working

Comments

@chenzf11

Using a local Ollama model, with code like this:

from paperqa import Settings, ask
from paperqa.settings import AgentSettings
import os

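# Dummy value; a local Ollama server does not need a real OpenAI API key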
os.environ['OPENAI_API_KEY'] = "ollama"

local_llm_config = {
    "model_list": [
        {
            "model_name": "ollama/qwen:32b-chat",
            "litellm_params": {
                "model": "ollama/qwen:32b-chat",
                "api_base": "http://xxx:11434",
            },
        },
    ]
}

answer = ask(
    "What is RAG?",
    settings=Settings(
        llm="ollama/qwen:32b-chat",
        llm_config=local_llm_config,
        summary_llm="ollama/qwen:32b-chat",
        summary_llm_config=local_llm_config,
        embedding="ollama/bge-m3:latest",
        agent=AgentSettings(
            agent_llm='ollama/qwen:32b-chat', 
            agent_llm_config=local_llm_config
        ),
        paper_directory="docs"
    ),
)

The result suggests the input files were never processed:

Encountered exception during tool call for tool paper_search: RuntimeError('Index pqa_index_6714f503dc3c5f6bb733600ea5444832 was empty, please rebuild it.')

Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.

[17:33:30] Trajectory failed.                                                                                                                                                   
           ╭──────────────────────────────────────────────────────────────── Traceback (most recent call last) ────────────────────────────────────────────────────────────────╮
           │ /opt/anaconda/envs/llamafactory/lib/python3.11/site-packages/litellm/main.py:486 in acompletion                                                                   │
           │                                                                                                                                                                   │
           │    483 │   │   │   │   │   response = ModelResponse(**init_response)                                                                                              │
           │    484 │   │   │   │   response = init_response                                                                                                                   │
           │    485 │   │   │   elif asyncio.iscoroutine(init_response):                                                                                                       │
           │ ❱  486 │   │   │   │   response = await init_response                                                                                                             │
           │    487 │   │   │   else:                                                                                                                                          │
           │    488 │   │   │   │   response = init_response  # type: ignore                                                                                                   │
           │    489                                                                                                                                                            │
           │                                                                                                                                                                   │
           │ /opt/anaconda/envs/llamafactory/lib/python3.11/site-packages/litellm/llms/custom_httpx/llm_http_handler.py:67 in async_completion                                 │
           │                                                                                                                                                                   │
           │    64 │   │   │   )                                                                                                                                               │
           │    65 │   │   except Exception as e:                                                                                                                              │
           │    66 │   │   │   raise self._handle_error(e=e, provider_config=provider_config)                                                                                  │
           │ ❱  67 │   │   return provider_config.transform_response(                                                                                                          │
           │    68 │   │   │   model=model,                                                                                                                                    │
           │    69 │   │   │   raw_response=response,                                                                                                                          │
           │    70 │   │   │   model_response=model_response,                                                                                                                  │
           │                                                                                                                                                                   │
           │ /opt/anaconda/envs/llamafactory/lib/python3.11/site-packages/litellm/llms/ollama/completion/transformation.py:263 in transform_response                           │
           │                                                                                                                                                                   │
           │   260 │   │   │   │   │   {                                                                                                                                       │
           │   261 │   │   │   │   │   │   "id": f"call_{str(uuid.uuid4())}",                                                                                                  │
           │   262 │   │   │   │   │   │   "function": {                                                                                                                       │
           │ ❱ 263 │   │   │   │   │   │   │   "name": function_call["name"],                                                                                                  │
           │   264 │   │   │   │   │   │   │   "arguments": json.dumps(function_call["arguments"]),                                                                            │
           │   265 │   │   │   │   │   │   },                                                                                                                                  │
           │   266 │   │   │   │   │   │   "type": "function",                                                                                                                 │
           ╰───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
           KeyError: 'name'                                                                                                                                                     
                                                                                                                                                                                
           During handling of the above exception, another exception occurred:                                                                                                  
                                                                                                                                                                                
           ╭──────────────────────────────────────────────────────────────── Traceback (most recent call last) ────────────────────────────────────────────────────────────────╮
           │ /data/personal/zonghu.wang/project/material/rag/rag_evaluate/rag_eval/paperqa2/paper-qa/paperqa/agents/main.py:149 in _run_with_timeout_failure                   │
           │                                                                                                                                                                   │
           │   146 ) -> tuple[PQASession, AgentStatus]:                                                                                                                        │
           │   147 │   try:                                                                                                                                                    │
           │   148 │   │   async with asyncio.timeout(query.settings.agent.timeout):                                                                                           │
           │ ❱ 149 │   │   │   status = await rollout()                                                                                                                        │
           │   150 │   except TimeoutError:                                                                                                                                    │
           │   151 │   │   logger.warning(                                                                                                                                     │
           │   152 │   │   │   f"Agent timeout after {query.settings.agent.timeout}-sec, just answering."                                                                      │
           │                                                                                                                                                                   │
           │ /data/personal/zonghu.wang/project/material/rag/rag_evaluate/rag_eval/paperqa2/paper-qa/paperqa/agents/main.py:293 in rollout                                     │
           │                                                                                                                                                                   │
           │   290 │   │   │   │   )                                                                                                                                           │
           │   291 │   │   │   │   return AgentStatus.TRUNCATED                                                                                                                │
           │   292 │   │   │   agent_state.messages += obs                                                                                                                     │
           │ ❱ 293 │   │   │   for attempt in Retrying(                                                                                                                        │
           │   294 │   │   │   │   stop=stop_after_attempt(5),                                                                                                                 │
           │   295 │   │   │   │   retry=retry_if_exception_type(MalformedMessageError),                                                                                       │
           │   296 │   │   │   │   before_sleep=before_sleep_log(logger, logging.WARNING),                                                                                     │
           │                                                                                                                                                                   │
           │ /opt/anaconda/envs/llamafactory/lib/python3.11/site-packages/tenacity/__init__.py:443 in __iter__                                                                 │
           │                                                                                                                                                                   │
           │   440 │   │                                                                                                                                                       │
           │   441 │   │   retry_state = RetryCallState(self, fn=None, args=(), kwargs={})                                                                                     │
           │   442 │   │   while True:                                                                                                                                         │
           │ ❱ 443 │   │   │   do = self.iter(retry_state=retry_state)                                                                                                         │
           │   444 │   │   │   if isinstance(do, DoAttempt):                                                                                                                   │
           │   445 │   │   │   │   yield AttemptManager(retry_state=retry_state)                                                                                               │
           │   446 │   │   │   elif isinstance(do, DoSleep):                                                                                                                   │
           │                                                                                                                                                                   │
           │ /opt/anaconda/envs/llamafactory/lib/python3.11/site-packages/tenacity/__init__.py:376 in iter                                                                     │
           │                                                                                                                                                                   │
           │   373 │   │   self._begin_iter(retry_state)                                                                                                                       │
           │   374 │   │   result = None                                                                                                                                       │
           │   375 │   │   for action in self.iter_state.actions:                                                                                                              │
           │ ❱ 376 │   │   │   result = action(retry_state)                                                                                                                    │
           │   377 │   │   return result                                                                                                                                       │
           │   378 │                                                                                                                                                           │
           │   379 │   def _begin_iter(self, retry_state: "RetryCallState") -> None:  # noqa                                                                                   │
           │                                                                                                                                                                   │
           │ /opt/anaconda/envs/llamafactory/lib/python3.11/site-packages/tenacity/__init__.py:398 in <lambda>                                                                 │
           │                                                                                                                                                                   │
           │   395 │                                                                                                                                                           │
           │   396 │   def _post_retry_check_actions(self, retry_state: "RetryCallState") -> None:                                                                             │
           │   397 │   │   if not (self.iter_state.is_explicit_retry or self.iter_state.retry_run_result):                                                                     │
           │ ❱ 398 │   │   │   self._add_action_func(lambda rs: rs.outcome.result())                                                                                           │
           │   399 │   │   │   return                                                                                                                                          │
           │   400 │   │                                                                                                                                                       │
           │   401 │   │   if self.after is not None:                                                                                                                          │
           │                                                                                                                                                                   │
           │ /opt/anaconda/envs/llamafactory/lib/python3.11/concurrent/futures/_base.py:449 in result                                                                          │
           │                                                                                                                                                                   │
           │   446 │   │   │   │   if self._state in [CANCELLED, CANCELLED_AND_NOTIFIED]:                                                                                      │
           │   447 │   │   │   │   │   raise CancelledError()                                                                                                                  │
           │   448 │   │   │   │   elif self._state == FINISHED:                                                                                                               │
           │ ❱ 449 │   │   │   │   │   return self.__get_result()                                                                                                              │
           │   450 │   │   │   │                                                                                                                                               │
           │   451 │   │   │   │   self._condition.wait(timeout)                                                                                                               │
           │   452                                                                                                                                                             │
           │                                                                                                                                                                   │
           │ /opt/anaconda/envs/llamafactory/lib/python3.11/concurrent/futures/_base.py:401 in __get_result                                                                    │
           │                                                                                                                                                                   │
           │   398 │   def __get_result(self):                                                                                                                                 │
           │   399 │   │   if self._exception:                                                                                                                                 │
           │   400 │   │   │   try:                                                                                                                                            │
           │ ❱ 401 │   │   │   │   raise self._exception                                                                                                                       │
           │   402 │   │   │   finally:                                                                                                                                        │
           │   403 │   │   │   │   # Break a reference cycle with the exception in self._exception                                                                             │
           │   404 │   │   │   │   self = None                                                                                                                                 │
           │                                                                                                                                                                   │
           │ /data/personal/zonghu.wang/project/material/rag/rag_evaluate/rag_eval/paperqa2/paper-qa/paperqa/agents/main.py:300 in rollout                                     │
           │                                                                                                                                                                   │
           │   297 │   │   │   │   reraise=True,                                                                                                                               │
           │   298 │   │   │   ):                                                                                                                                              │
           │   299 │   │   │   │   with attempt:  # Retrying if ToolSelector fails to select a tool                                                                            │
           │ ❱ 300 │   │   │   │   │   action = await agent(agent_state.messages, tools)                                                                                       │
           │   301 │   │   │   agent_state.messages = [*agent_state.messages, action]                                                                                          │
           │   302 │   │   │   if on_agent_action_callback:                                                                                                                    │
           │   303 │   │   │   │   await on_agent_action_callback(action, agent_state)                                                                                         │
           │                                                                                                                                                                   │
           │ /opt/anaconda/envs/llamafactory/lib/python3.11/site-packages/aviary/tools/utils.py:84 in __call__                                                                 │
           │                                                                                                                                                                   │
           │    81 │   │   │   self._ledger.messages.extend(messages)                                                                                                          │
           │    82 │   │   │   messages = self._ledger.messages                                                                                                                │
           │    83 │   │                                                                                                                                                       │
           │ ❱  84 │   │   model_response = await self._bound_acompletion(                                                                                                     │
           │    85 │   │   │   messages=MessagesAdapter.dump_python(                                                                                                           │
           │    86 │   │   │   │   messages, exclude_none=True, by_alias=True                                                                                                  │
           │    87 │   │   │   ),                                                                                                                                              │
           │                                                                                                                                                                   │
           │ /opt/anaconda/envs/llamafactory/lib/python3.11/site-packages/litellm/router.py:833 in acompletion                                                                 │
           │                                                                                                                                                                   │
           │    830 │   │   │   │   │   original_exception=e,                                                                                                                  │
           │    831 │   │   │   │   )                                                                                                                                          │
           │    832 │   │   │   )                                                                                                                                              │
           │ ❱  833 │   │   │   raise e                                                                                                                                        │
           │    834 │                                                                                                                                                          │
           │    835 │   async def _acompletion(                                                                                                                                │
           │    836 │   │   self, model: str, messages: List[Dict[str, str]], **kwargs                                                                                         │
           │                                                                                                                                                                   │
           │ /opt/anaconda/envs/llamafactory/lib/python3.11/site-packages/litellm/router.py:809 in acompletion                                                                 │
           │                                                                                                                                                                   │
           │    806 │   │   │   if request_priority is not None and isinstance(request_priority, int):                                                                         │
           │    807 │   │   │   │   response = await self.schedule_acompletion(**kwargs)                                                                                       │
           │    808 │   │   │   else:                                                                                                                                          │
           │ ❱  809 │   │   │   │   response = await self.async_function_with_fallbacks(**kwargs)                                                                              │
           │    810 │   │   │   end_time = time.time()                                                                                                                         │
           │    811 │   │   │   _duration = end_time - start_time                                                                                                              │
           │    812 │   │   │   asyncio.create_task(                                                                                                                           │
           │                                                                                                                                                                   │
           │ /opt/anaconda/envs/llamafactory/lib/python3.11/site-packages/litellm/router.py:2803 in async_function_with_fallbacks                                              │
           │                                                                                                                                                                   │
           │   2800 │   │   │   │   │   │   )                                                                                                                                  │
           │   2801 │   │   │   │   │   )                                                                                                                                      │
           │   2802 │   │   │                                                                                                                                                  │
           │ ❱ 2803 │   │   │   raise original_exception                                                                                                                       │
           │   2804 │                                                                                                                                                          │
           │   2805 │   def _handle_mock_testing_fallbacks(                                                                                                                    │
           │   2806 │   │   self,                                                                                                                                              │
           │                                                                                                                                                                   │
           │ /opt/anaconda/envs/llamafactory/lib/python3.11/site-packages/litellm/router.py:2619 in async_function_with_fallbacks                                              │
           │                                                                                                                                                                   │
           │   2616 │   │   │   │   content_policy_fallbacks=content_policy_fallbacks,                                                                                         │
           │   2617 │   │   │   )                                                                                                                                              │
           │   2618 │   │   │                                                                                                                                                  │
           │ ❱ 2619 │   │   │   response = await self.async_function_with_retries(                                                                                             │
           │   2620 │   │   │   │   *args, **kwargs, mock_timeout=mock_timeout                                                                                                 │
           │   2621 │   │   │   )                                                                                                                                              │
           │   2622 │   │   │   verbose_router_logger.debug(f"Async Response: {response}")                                                                                     │
           │                                                                                                                                                                   │
           │ /opt/anaconda/envs/llamafactory/lib/python3.11/site-packages/litellm/router.py:2981 in async_function_with_retries                                                │
           │                                                                                                                                                                   │
           │   2978 │   │   │   │   setattr(original_exception, "max_retries", num_retries)                                                                                    │
           │   2979 │   │   │   │   setattr(original_exception, "num_retries", current_attempt)                                                                                │
           │   2980 │   │   │                                                                                                                                                  │
           │ ❱ 2981 │   │   │   raise original_exception                                                                                                                       │
           │   2982 │                                                                                                                                                          │
           │   2983 │   async def make_call(self, original_function: Any, *args, **kwargs):                                                                                    │
           │   2984 │   │   """                                                                                                                                                │
           │                                                                                                                                                                   │
           │ /opt/anaconda/envs/llamafactory/lib/python3.11/site-packages/litellm/router.py:2887 in async_function_with_retries                                                │
           │                                                                                                                                                                   │
           │   2884 │   │   │   │   model_group=model_group, kwargs=kwargs                                                                                                     │
           │   2885 │   │   │   )                                                                                                                                              │
           │   2886 │   │   │   # if the function call is successful, no exception will be raised and we'll                                                                    │
           │        break out of the loop                                                                                                                                      │
           │ ❱ 2887 │   │   │   response = await self.make_call(original_function, *args, **kwargs)                                                                            │
           │   2888 │   │   │                                                                                                                                                  │
           │   2889 │   │   │   return response                                                                                                                                │
           │   2890 │   │   except Exception as e:                                                                                                                             │
           │                                                                                                                                                                   │
           │ /opt/anaconda/envs/llamafactory/lib/python3.11/site-packages/litellm/router.py:2990 in make_call                                                                  │
           │                                                                                                                                                                   │
           │   2987 │   │   model_group = kwargs.get("model")                                                                                                                  │
           │   2988 │   │   response = original_function(*args, **kwargs)                                                                                                      │
           │   2989 │   │   if inspect.iscoroutinefunction(response) or inspect.isawaitable(response):                                                                         │
           │ ❱ 2990 │   │   │   response = await response                                                                                                                      │
           │   2991 │   │   ## PROCESS RESPONSE HEADERS                                                                                                                        │
           │   2992 │   │   response = await self.set_response_headers(                                                                                                        │
           │   2993 │   │   │   response=response, model_group=model_group                                                                                                     │
           │                                                                                                                                                                   │
           │ /opt/anaconda/envs/llamafactory/lib/python3.11/site-packages/litellm/router.py:958 in _acompletion                                                                │
           │                                                                                                                                                                   │
           │    955 │   │   │   )                                                                                                                                              │
           │    956 │   │   │   if model_name is not None:                                                                                                                     │
           │    957 │   │   │   │   self.fail_calls[model_name] += 1                                                                                                           │
           │ ❱  958 │   │   │   raise e                                                                                                                                        │
           │    959 │                                                                                                                                                          │
           │    960 │   def _update_kwargs_before_fallbacks(self, model: str, kwargs: dict) -> None:                                                                           │
           │    961 │   │   """                                                                                                                                                │
           │                                                                                                                                                                   │
           │ /opt/anaconda/envs/llamafactory/lib/python3.11/site-packages/litellm/router.py:926 in _acompletion                                                                │
           │                                                                                                                                                                   │
           │    923 │   │   │   │   │   parent_otel_span=parent_otel_span,                                                                                                     │
           │    924 │   │   │   │   )                                                                                                                                          │
           │    925 │   │   │   │                                                                                                                                              │
           │ ❱  926 │   │   │   │   response = await _response                                                                                                                 │
           │    927 │   │   │                                                                                                                                                  │
           │    928 │   │   │   ## CHECK CONTENT FILTER ERROR ##                                                                                                               │
           │    929 │   │   │   if isinstance(response, ModelResponse):                                                                                                        │
           │                                                                                                                                                                   │
           │ /opt/anaconda/envs/llamafactory/lib/python3.11/site-packages/litellm/utils.py:1202 in wrapper_async                                                               │
           │                                                                                                                                                                   │
           │   1199 │   │   │   setattr(                                                                                                                                       │
           │   1200 │   │   │   │   e, "num_retries", num_retries                                                                                                              │
           │   1201 │   │   │   )  ## IMPORTANT: returns the deployment's num_retries to the router                                                                            │
           │ ❱ 1202 │   │   │   raise e                                                                                                                                        │
           │   1203 │                                                                                                                                                          │
           │   1204 │   is_coroutine = inspect.iscoroutinefunction(original_function)                                                                                          │
           │   1205                                                                                                                                                            │
           │                                                                                                                                                                   │
           │ /opt/anaconda/envs/llamafactory/lib/python3.11/site-packages/litellm/utils.py:1056 in wrapper_async                                                               │
           │                                                                                                                                                                   │
           │   1053 │   │   │   │   return _caching_handler_response.final_embedding_cached_response                                                                           │
           │   1054 │   │   │                                                                                                                                                  │
           │   1055 │   │   │   # MODEL CALL                                                                                                                                   │
           │ ❱ 1056 │   │   │   result = await original_function(*args, **kwargs)                                                                                              │
           │   1057 │   │   │   end_time = datetime.datetime.now()                                                                                                             │
           │   1058 │   │   │   if "stream" in kwargs and kwargs["stream"] is True:                                                                                            │
           │   1059 │   │   │   │   if (                                                                                                                                       │
           │                                                                                                                                                                   │
           │ /opt/anaconda/envs/llamafactory/lib/python3.11/site-packages/litellm/main.py:508 in acompletion                                                                   │
           │                                                                                                                                                                   │
           │    505 │   │   return response                                                                                                                                    │
           │    506 │   except Exception as e:                                                                                                                                 │
           │    507 │   │   custom_llm_provider = custom_llm_provider or "openai"                                                                                              │
           │ ❱  508 │   │   raise exception_type(                                                                                                                              │
           │    509 │   │   │   model=model,                                                                                                                                   │
           │    510 │   │   │   custom_llm_provider=custom_llm_provider,                                                                                                       │
           │    511 │   │   │   original_exception=e,                                                                                                                          │
           │                                                                                                                                                                   │
           │ /opt/anaconda/envs/llamafactory/lib/python3.11/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py:2190 in exception_type                         │
           │                                                                                                                                                                   │
           │   2187 │   │   # don't let an error with mapping interrupt the user from receiving an error                                                                       │
           │        from the llm api calls                                                                                                                                     │
           │   2188 │   │   if exception_mapping_worked:                                                                                                                       │
           │   2189 │   │   │   setattr(e, "litellm_response_headers", litellm_response_headers)                                                                               │
           │ ❱ 2190 │   │   │   raise e                                                                                                                                        │
           │   2191 │   │   else:                                                                                                                                              │
           │   2192 │   │   │   for error_type in litellm.LITELLM_EXCEPTION_TYPES:                                                                                             │
           │   2193 │   │   │   │   if isinstance(e, error_type):                                                                                                              │
           │                                                                                                                                                                   │
           │ /opt/anaconda/envs/llamafactory/lib/python3.11/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py:2166 in exception_type                         │
           │                                                                                                                                                                   │
           │   2163 │   │   │   │   │   request=original_exception.request,                                                                                                    │
           │   2164 │   │   │   │   )                                                                                                                                          │
           │   2165 │   │   │   else:                                                                                                                                          │
           │ ❱ 2166 │   │   │   │   raise APIConnectionError(                                                                                                                  │
           │   2167 │   │   │   │   │   message="{}\n{}".format(                                                                                                               │
           │   2168 │   │   │   │   │   │   str(original_exception), traceback.format_exc()                                                                                    │
           │   2169 │   │   │   │   │   ),                                                                                                                                     │
           ╰───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
           APIConnectionError: litellm.APIConnectionError: 'name'                                                                                                               
           Traceback (most recent call last):                                                                                                                                   
             File "/opt/anaconda/envs/llamafactory/lib/python3.11/site-packages/litellm/main.py", line 486, in acompletion                                                      
               response = await init_response                                                                                                                                   
                          ^^^^^^^^^^^^^^^^^^^                                                                                                                                   
             File "/opt/anaconda/envs/llamafactory/lib/python3.11/site-packages/litellm/llms/custom_httpx/llm_http_handler.py", line 67, in async_completion                    
               return provider_config.transform_response(                                                                                                                       
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^                                                                                                                       
             File "/opt/anaconda/envs/llamafactory/lib/python3.11/site-packages/litellm/llms/ollama/completion/transformation.py", line 263, in transform_response              
               "name": function_call["name"],                                                                                                                                   
                       ~~~~~~~~~~~~~^^^^^^^^                                                                                                                                    
           KeyError: 'name'                                                                                                                                                     
                                                                                                                                                                                
           Received Model Group=ollama/qwen:32b-chat                                                                                                                            
           Available Model Group Fallbacks=None LiteLLM Retried: 2 times, LiteLLM Max Retries: 3                                                                                
[17:33:32] Generating answer for 'What is RAG?'.                                                                                                                                
Could not find cost for model ollama/qwen:32b-chat.
[17:33:38] Status: Paper Count=0 | Relevant Papers=0 | Current Evidence=0 | Current Cost=$0.0000                                                                                
           Answer: RAG is an acronym that stands for Red-Amber-Green, which is a widely used traffic light system in project management, risk assessment, and other             
           organizational contexts to visually represent the status or progress of tasks, projects, or objectives. The system assigns colors to indicate different levels of    
           maturity, priority, or risk:                                                                                                                                         
                                                                                                                                                                                
           - **Red** typically signifies high risk, unaddressed issues, significant delays, or inadequate performance.                                                          
           - **Amber** denotes moderate risk, potential concerns, or areas that require attention but are not critical yet.                                                     
           - **Green** represents a satisfactory status, low risk, good progress, or acceptable performance.                                                                    
                                                                                                                                                                                
           The RAG system provides a simple and intuitive way for teams and stakeholders to quickly assess the overall situation and prioritize actions. It is often used in project
           portfolios, where multiple projects or tasks are monitored simultaneously, allowing decision-makers to focus on high-risk items first and ensure resources are       
           allocated effectively.                                                                                                                                               
                                                                                                                                                                                
           Although the context provided does not explicitly discuss RAG, the acronym's definition and use case are standard across various industries and settings. Therefore, 
           without specific information contradicting this explanation within the given context, it is safe to assume that the RAG system functions as described above.         
                                                                                                                                                                                
           Citation:                                                                                                                                                            
           (No direct source available in the provided context for the RAG acronym, but its description aligns with general understanding and widespread use.)  

dosubot bot commented Dec 31, 2024

Hey @chenzf11! 👋 I'm here to help you with any bugs, questions, or contributions you have in mind. Let's squash those bugs together!

To resolve the error "Index pqa_index_6714f503dc3c5f6bb733600ea5444832 was empty, please rebuild it," you need to rebuild the index. This can be done by setting the build parameter to True when calling the get_directory_index function. Here's how you can modify your code to rebuild the index:

await get_directory_index(build=True)

This will ensure that the index is rebuilt, which should resolve the error. Additionally, make sure that the sync_index_w_directory parameter is set appropriately if you want to synchronize the index with the source directory. If the index was not previously built, its index_files will be empty, and rebuilding is necessary [1][2].
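A minimal runnable sketch of that rebuild (assuming get_directory_index is importable from paperqa.agents.search, as in current paper-qa; signatures may differ across versions):

import asyncio

from paperqa import Settings
from paperqa.agents.search import get_directory_index

async def rebuild_index() -> None:
    settings = Settings(paper_directory="docs")  # same directory as the script above
    # build=True (the default) forces the index to be (re)built from the paper directory
    index = await get_directory_index(settings=settings, build=True)
    print(await index.index_files)  # should list your papers once indexing succeeds

asyncio.run(rebuild_index())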


chenzf11 changed the title from "Cannot parse PDF" to "how to deal Paper Count=0 | Relevant Papers=0 | Current Evidence=0" on Dec 31, 2024
dosubot added the "bug" label on Dec 31, 2024
@jamesbraza (Collaborator)

Hello @chenzf11, it looks like you used ask, which should build the index for you, but there is this error at the top of your logs: Index pqa_index_6714f503dc3c5f6bb733600ea5444832 was empty, please rebuild it.

You won't get any texts to process if you have an empty index, which explains the paper count of 0.

So the issue is that your paper directory is not getting picked up. You used a relative path for the paper directory; are you sure docs is in the right place?


As an aside, setting paper_directory on Settings is now deprecated; move to Settings.index.paper_directory. Setting Settings.paper_directory should still work, though.
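For example, a minimal sketch of both fixes, using an absolute path so the directory resolves regardless of the working directory (IndexSettings import path assumed from current paper-qa):

import os

from paperqa import Settings
from paperqa.settings import IndexSettings

settings = Settings(
    # absolute path avoids depending on where the script is launched from
    index=IndexSettings(paper_directory=os.path.abspath("docs")),
)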

@chenzf11 (Author) commented Jan 6, 2025

Thanks! I found that the files in my folder were not PDF files. After changing the file extensions to PDF, I could start building the index, but I still couldn't get the Ollama model to work. I checked #582 and it works now.
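(For anyone hitting the same symptom, a quick sanity check of what the indexer will actually see in the paper directory:)

from pathlib import Path

# list every file and its extension; paper-qa expects document files such as PDFs
for path in Path("docs").iterdir():
    print(path.suffix or "(no extension)", path.name)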
