
[doc] Updated default value of quote in 'get answers' (#1093)

### What problem does this PR solve?

_Briefly describe what this PR aims to solve. Include background context
that will help reviewers understand the purpose of the PR._

### Type of change

- [x] Documentation Update
tags/v0.8.0
writinwaters · 1 year ago · commit 22468a8590
2 changed files with 12 additions and 8 deletions:
  1. docs/guides/deploy_local_llm.md (+11, −7)
  2. docs/references/api.md (+1, −1)

docs/guides/deploy_local_llm.md (+11, −7)

@@ -115,34 +115,38 @@ Xorbits Inference([Xinference](https://github.com/xorbitsai/inference)) enables
- For a complete list of supported models, see the [Builtin Models](https://inference.readthedocs.io/en/latest/models/builtin/).
:::

- To deploy a local model, e.g., **Llama3**, using Xinference:
+ To deploy a local model, e.g., **Mistral**, using Xinference:

- ### 1. Start an Xinference instance
+ ### 1. Check firewall settings

Ensure that your host machine's firewall allows inbound connections on port 9997.
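For example, on a Ubuntu host using `ufw`, the port could be opened as sketched below; this is an illustration only, and the equivalent step on your system may instead involve `firewall-cmd`, `iptables`, or a cloud provider's security-group settings:

```shell
# Allow inbound TCP connections on Xinference's default port (ufw example).
sudo ufw allow 9997/tcp

# Confirm the rule is active.
sudo ufw status | grep 9997
```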

### 2. Start an Xinference instance

```bash
$ xinference-local --host 0.0.0.0 --port 9997
```

- ### 2. Launch your local model
+ ### 3. Launch your local model

Launch your local model (**Mistral**), ensuring that you replace `${quantization}` with your chosen quantization method:
```bash
$ xinference launch -u mistral --model-name mistral-v0.1 --size-in-billions 7 --model-format pytorch --quantization ${quantization}
```
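Once the launch command returns, you can sanity-check that the model is running before moving on. A minimal sketch, assuming Xinference is serving on the same host at its default port:

```shell
# List the models currently running on the local Xinference instance.
xinference list --endpoint http://127.0.0.1:9997

# Xinference also exposes an OpenAI-compatible API, so the launched model
# should appear in the model list served under /v1.
curl http://127.0.0.1:9997/v1/models
```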
- ### 3. Add Xinference
+ ### 4. Add Xinference

In RAGFlow, click on your logo on the top right of the page **>** **Model Providers** and add Xinference to RAGFlow:

![add xinference](https://github.com/infiniflow/ragflow/assets/93570324/10635088-028b-4b3d-add9-5c5a6e626814)

- ### 4. Complete basic Xinference settings
+ ### 5. Complete basic Xinference settings

Enter an accessible base URL, such as `http://<your-xinference-endpoint-domain>:9997/v1`.

- ### 5. Update System Model Settings
+ ### 6. Update System Model Settings

- Click on your logo **>** **Model Providers** **>** **System Model Settings** to update your model:
+ Click on your logo **>** **Model Providers** **>** **System Model Settings** to update your model.
*You should now be able to find **mistral** from the dropdown list under **Chat model**.*


docs/references/api.md (+1, −1)

@@ -224,7 +224,7 @@ This method retrieves from RAGFlow the answer to the user's latest question.
| Parameter        | Type   | Required | Description   |
|------------------|--------|----------|---------------|
| `conversation_id`| string | Yes | The ID of the conversation session. Call ['GET' /new_conversation](#create-conversation) to retrieve the ID.|
| `messages` | json | Yes | The latest question in a JSON form, such as `[{"role": "user", "content": "How are you doing!"}]`|
- | `quote` | bool | No | Default: true |
+ | `quote` | bool | No | Default: false |
| `stream` | bool | No | Default: true |
| `doc_ids` | string | No | Document IDs delimited by comma, like `c790da40ea8911ee928e0242ac180005,23dsf34ree928e0242ac180005`. The retrieved contents will be confined to these documents. |
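Putting the parameters together, a request that relies on the new `quote` default might look like the sketch below. The endpoint path, host, and Bearer-token header are illustrative assumptions, not confirmed by this diff; substitute the actual completion endpoint and API key from the surrounding API reference:

```shell
# Hypothetical example: send the latest question in an existing conversation.
# quote is omitted (defaults to false after this change); stream defaults to true.
# Replace the URL, API key, and conversation_id with real values.
curl -X POST 'http://<your-ragflow-host>/v1/api/completion' \
     -H 'Authorization: Bearer <your-api-key>' \
     -H 'Content-Type: application/json' \
     -d '{
           "conversation_id": "<conversation-id>",
           "messages": [{"role": "user", "content": "How are you doing!"}]
         }'
```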

