Compare commits b68b7aac74...4921590d3c (10 commits)
| Author | SHA1 | Date |
|---|---|---|
|  | 4921590d3c |  |
|  | 69507fb7da |  |
|  | 95cb77356c |  |
|  | a40cd2e63c |  |
|  | 1a5e5eb5dd |  |
|  | 5c3a34ef12 |  |
|  | ff73928c7d |  |
|  | 9d4f8639d6 |  |
|  | e19182e3d5 |  |
|  | 5cd4752a16 |  |
@@ -4,7 +4,7 @@ tags:
 - financial-sentiment-analysis
 - sentiment-analysis
 widget:
-- text: "Stocks rallied and the British pound gained."
+- text: "growth is strong and we have plenty of liquidity"
 ---
 
 `FinBERT` is a BERT model pre-trained on financial communication text. The purpose is to enhance financial NLP research and practice. It is trained on the following three financial communication corpora. The total corpus size is 4.9B tokens.
|
@@ -14,10 +14,13 @@ widget:
 
 More technical details on `FinBERT`: [Click Link](https://github.com/yya518/FinBERT)
 
-Please check out our working paper [`FinBERT—A Deep Learning Approach to Extracting Textual Information`](https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3910214).
 
 This released `finbert-tone` model is the `FinBERT` model fine-tuned on 10,000 manually annotated (positive, negative, neutral) sentences from analyst reports. This model achieves superior performance on the financial tone analysis task. If you are simply interested in using `FinBERT` for financial tone analysis, give it a try.
 
+If you use the model in your academic work, please cite the following paper:
+
+Huang, Allen H., Hui Wang, and Yi Yang. "FinBERT: A Large Language Model for Extracting Information from Financial Text." *Contemporary Accounting Research* (2022).
+
 # How to use
 You can use this model with the Transformers pipeline for sentiment analysis.
 ```python
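# The compare view truncates the hunk here, so the rest of the example is not
# shown. Below is a minimal usage sketch with the Transformers
# sentiment-analysis pipeline; the Hugging Face repo id
# "yiyanghkust/finbert-tone" is an assumption, since the diff does not name
# the repository.
from transformers import BertTokenizer, BertForSequenceClassification, pipeline

finbert = BertForSequenceClassification.from_pretrained("yiyanghkust/finbert-tone", num_labels=3)
tokenizer = BertTokenizer.from_pretrained("yiyanghkust/finbert-tone")

nlp = pipeline("sentiment-analysis", model=finbert, tokenizer=tokenizer)

sentences = [
    "growth is strong and we have plenty of liquidity",  # the new widget example above
    "there is a shortage of capital, and we need extra financing",
]
print(nlp(sentences))  # each result is a dict with a 'label' and a 'score'
```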
config.json (15 changed lines)
@@ -1,18 +1,17 @@
 {
   "architectures": [
     "BertForSequenceClassification"
   ],
   "id2label": {
-    "0": "neutral",
-    "1": "positive",
-    "2": "negative"
+    "0": "Neutral",
+    "1": "Positive",
+    "2": "Negative"
   },
   "label2id": {
-    "positive": 1,
-    "negative": 2,
-    "neutral": 0
+    "Positive": 1,
+    "Negative": 2,
+    "Neutral": 0
   },
-  "model_type": "bert",
   "attention_probs_dropout_prob": 0.1,
   "hidden_act": "gelu",
   "hidden_dropout_prob": 0.1,
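The `id2label`/`label2id` edit above changes the label strings the configuration maps predictions to, so a pipeline built on this config would report `Positive` rather than `positive`. A minimal sketch of how to check the mapping after the change, assuming the Hugging Face repo id `yiyanghkust/finbert-tone` (the compare view does not name the repository):

```python
from transformers import AutoConfig

# Assumed repo id; the diff above does not state it explicitly.
config = AutoConfig.from_pretrained("yiyanghkust/finbert-tone")

# With this change applied, the mapping uses the capitalized label names.
print(config.id2label)   # expected: {0: 'Neutral', 1: 'Positive', 2: 'Negative'}
print(config.label2id)   # expected: {'Neutral': 0, 'Positive': 1, 'Negative': 2}
```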