# Compare commits

10 commits: 9f9bcd7ffd ... 62acf01b9c

| Author | SHA1 | Date |
|---|---|---|
| | 62acf01b9c | |
| | 0d2992cb02 | |
| | d15d4b8623 | |
| | 100eb4c8eb | |
| | d404c4866b | |
| | f794482210 | |
| | 28fcc1bff2 | |
| | 6e5eb22d70 | |
| | 5a9665edf7 | |
| | 8ca2a4881d | |
`.gitattributes`

```diff
@@ -14,3 +14,4 @@
 *.pb filter=lfs diff=lfs merge=lfs -text
 *.pt filter=lfs diff=lfs merge=lfs -text
 *.pth filter=lfs diff=lfs merge=lfs -text
+*.msgpack filter=lfs diff=lfs merge=lfs -text
```
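Each of these attribute lines is what `git lfs track "<pattern>"` writes into `.gitattributes`; the new commit adds the `*.msgpack` pattern. As a minimal, repo-independent sketch of the line format (the parsing below is illustrative, not a Git API):

```python
import pathlib
import tempfile

# The attribute line the new commit adds; `git lfs track "*.msgpack"` would
# append the same line (assuming standard Git LFS defaults).
line = "*.msgpack filter=lfs diff=lfs merge=lfs -text"

with tempfile.TemporaryDirectory() as d:
    attrs = pathlib.Path(d) / ".gitattributes"
    attrs.write_text(line + "\n")
    # A .gitattributes line is: <pattern> <attribute>...
    pattern, *attributes = attrs.read_text().split()

print(pattern)     # *.msgpack
print(attributes)  # ['filter=lfs', 'diff=lfs', 'merge=lfs', '-text']
```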
`README.md` (96)

@@ -1,55 +1,63 @@
---
datasets:
- arxiv
widget:
- text: "summarize: We describe a system called Overton, whose main design goal is to support engineers in building, monitoring, and improving production machine learning systems. Key challenges engineers face are monitoring fine-grained quality, diagnosing errors in sophisticated applications, and handling contradictory or incomplete supervision data. Overton automates the life cycle of model construction, deployment, and monitoring by providing a set of novel high-level, declarative abstractions. Overton's vision is to shift developers to these higher-level tasks instead of lower-level machine learning tasks. In fact, using Overton, engineers can build deep-learning-based applications without writing any code in frameworks like TensorFlow. For over a year, Overton has been used in production to support multiple applications in both near-real-time applications and back-of-house processing. In that time, Overton-based applications have answered billions of queries in multiple languages and processed trillions of records, reducing errors 1.7-2.9 times versus production systems."
license: mit
---

# T5 One Line Summary

A T5 model trained on 370,000 research papers to generate a one-line summary from the description/abstract of a paper. It is trained using the [simpleT5](https://github.com/Shivanandroy/simpleT5) library - a Python package built on top of PyTorch Lightning⚡️ & Transformers🤗 to quickly train T5 models.

## Usage: [Open in Colab](https://colab.research.google.com/drive/1HrfT8IKLXvZzPFpl1EhZ3s_iiXG3O2VY?usp=sharing)

```python
abstract = """We describe a system called Overton, whose main design goal is to
support engineers in building, monitoring, and improving production machine learning systems.
Key challenges engineers face are monitoring fine-grained quality, diagnosing errors in
sophisticated applications, and handling contradictory or incomplete supervision data.
Overton automates the life cycle of model construction, deployment, and monitoring by providing a
set of novel high-level, declarative abstractions. Overton's vision is to shift developers to
these higher-level tasks instead of lower-level machine learning tasks. In fact, using Overton,
engineers can build deep-learning-based applications without writing any code
in frameworks like TensorFlow. For over a year, Overton has been used in production to support multiple
applications in both near-real-time applications and back-of-house processing.
In that time, Overton-based applications have answered billions of queries in multiple
languages and processed trillions of records reducing errors 1.7-2.9 times versus production systems.
"""
```
### Using Transformers🤗
```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_name = "snrspeaks/t5-one-line-summary"
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)

input_ids = tokenizer.encode(
    "summarize: " + abstract, return_tensors="pt", add_special_tokens=True
)
generated_ids = model.generate(
    input_ids=input_ids,
    num_beams=5,
    max_length=50,
    repetition_penalty=2.5,
    length_penalty=1,
    early_stopping=True,
    num_return_sequences=3,
)
preds = [
    tokenizer.decode(g, skip_special_tokens=True, clean_up_tokenization_spaces=True)
    for g in generated_ids
]
print(preds)

# output
["Overton: Building, Deploying, and Monitoring Machine Learning Systems for Engineers",
 "Overton: A System for Building, Monitoring, and Improving Production Machine Learning Systems",
 "Overton: Building, Monitoring, and Improving Production Machine Learning Systems"]
```
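The decode step in the block above is a reusable pattern independent of the model. A model-free sketch of factoring it into a helper (the `StubTokenizer` below is a stand-in used only so the helper can run offline; it is not part of transformers):

```python
def decode_batch(tokenizer, generated_ids):
    """Decode each generated token-id sequence into a cleaned-up string."""
    return [
        tokenizer.decode(g, skip_special_tokens=True, clean_up_tokenization_spaces=True)
        for g in generated_ids
    ]

class StubTokenizer:
    """Stand-in with the same decode() signature, for offline illustration only."""
    def decode(self, ids, skip_special_tokens=True, clean_up_tokenization_spaces=True):
        return " ".join(str(i) for i in ids)

preds = decode_batch(StubTokenizer(), [[7, 8, 9], [42]])
print(preds)  # ['7 8 9', '42']
```

With a real `AutoTokenizer` in place of the stub, `decode_batch(tokenizer, generated_ids)` produces the same list as the comprehension above.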
### Using simpleT5⚡️
```python
# pip install --upgrade simplet5
from simplet5 import SimpleT5

model = SimpleT5()
model.load_model("t5", "snrspeaks/t5-one-line-summary")
model.predict(abstract)

# output
"Overton: Building, Deploying, and Monitoring Machine Learning Systems for Engineers"
```