| Author | Commit | Message | Date |
| --- | --- | --- | --- |
| Joshua Lochner | cba017031d | Add demo script | 2023-02-20 17:14:29 +02:00 |
| Joshua Lochner | fa3d4d4ae1 | Create worker.js for demo | 2023-02-20 17:14:13 +02:00 |
| Joshua Lochner | 7f854467a9 | Create FUNDING.yml | 2023-02-20 16:02:05 +02:00 |
| Joshua Lochner | 98e659eaa7 | Dispatch callbacks on certain events | 2023-02-20 03:51:27 +02:00 |
| Joshua Lochner | 17ad585f66 | Only add global variables to window if not in worker | 2023-02-20 01:19:31 +02:00 |
| Joshua Lochner | 82968dba62 | Add import for ort in model.js | 2023-02-20 01:18:54 +02:00 |
| Joshua Lochner | 6721b57fd8 | Add callback function | 2023-02-20 01:18:35 +02:00 |
| Joshua Lochner | 2f16b4a595 | Add temperate and discount factor parameters to generate function | 2023-02-19 17:15:30 +02:00 |
| Joshua Lochner | 6a17ba43bf | Fix beam search sampling | 2023-02-19 15:32:10 +02:00 |
| Joshua Lochner | 9fbd742717 | Abstract generate function | 2023-02-19 11:57:18 +02:00 |
| Joshua Lochner | 6442932d53 | Rename variable | 2023-02-19 03:56:31 +02:00 |
| Joshua Lochner | 0b48767d1e | Replace for loop with for of loop | 2023-02-19 03:56:04 +02:00 |
| Joshua Lochner | 558ea62cc1 | Clean up T5 generation code | 2023-02-19 03:53:08 +02:00 |
| Joshua Lochner | 8e2fd3df5d | Remove repeated code between softmax and log_softmax | 2023-02-19 03:23:58 +02:00 |
| Joshua Lochner | fd7b7b17c3 | Add log_softmax function | 2023-02-19 03:09:31 +02:00 |
| Joshua Lochner | 11b1b7c2a0 | Add GPT2Tokenizer | 2023-02-19 03:09:06 +02:00 |
| Joshua Lochner | b29355f938 | Remove old logging | 2023-02-19 03:08:33 +02:00 |
| Joshua Lochner | c9b4b6feb3 | Implement beam search | 2023-02-19 03:06:54 +02:00 |
| Joshua Lochner | 8414b0f9d8 | Support decoding of multiple lists of token ids | 2023-02-19 01:37:59 +02:00 |
| Joshua Lochner | 807b9216fe | Fix top-k sampling | 2023-02-18 18:38:57 +02:00 |
| Joshua Lochner | a11b81111c | Implement ByteLevel encoding and decoding (GPT2) | 2023-02-18 15:48:23 +02:00 |
| Joshua Lochner | e285f3afa0 | Export `AutoModelForCausalLM` | 2023-02-18 15:43:31 +02:00 |
| Joshua Lochner | 7952d17feb | Improve generate methods | 2023-02-18 15:29:30 +02:00 |
| Joshua Lochner | 7d547b4928 | Fix past key dimensions | 2023-02-18 15:18:01 +02:00 |
| Joshua Lochner | 6f04da0d9b | Fix GPT generation and use optimum export for T5 | 2023-02-18 03:02:18 +02:00 |
| Joshua Lochner | 6e71ee125a | Reorganize generate method | 2023-02-16 20:20:17 +02:00 |
| Joshua Lochner | d2b4fadb8e | Fix `BertTokenizer` by including token_type_ids | 2023-02-16 04:54:37 +02:00 |
| Joshua Lochner | 917b457323 | Add `skip_special_tokens` option to decoder | 2023-02-16 04:54:13 +02:00 |
| Joshua Lochner | 1c962fa219 | Add sequence classification model | 2023-02-16 04:52:01 +02:00 |
| Joshua Lochner | 5d7e6fe417 | Create t5 model helper methods (adapted from https://github.com/Ki6an/fastT5) | 2023-02-16 00:21:45 +02:00 |
| Joshua Lochner | 8a09f503da | Remove redundant variable | 2023-02-16 00:20:26 +02:00 |
| Joshua Lochner | f5ad3059df | Create .gitignore | 2023-02-16 00:14:27 +02:00 |
| Joshua Lochner | 36ec1431f9 | Add `T5ForConditionalGeneration` model | 2023-02-16 00:14:20 +02:00 |
| Joshua Lochner | e0a88f9453 | Add conversion script for models | 2023-02-16 00:11:49 +02:00 |
| Joshua Lochner | 2ef0f00e8e | Get basic model working | 2023-02-15 02:40:32 +02:00 |
| Joshua Lochner | 8b7ab098bf | Add ort wasm | 2023-02-15 02:40:13 +02:00 |
| Joshua Lochner | fdef4896b2 | Add `from_pretrained` method to `PreTrainedTokenizer` | 2023-02-15 01:53:06 +02:00 |
| Joshua Lochner | 488b6829ea | Update imports and exports | 2023-02-15 01:52:47 +02:00 |
| Joshua Lochner | 57300873c2 | Rename classes | 2023-02-15 01:09:51 +02:00 |
| Joshua Lochner | 4be15f9555 | Create LICENSE | 2023-02-15 00:57:04 +02:00 |
| Joshua Lochner | 82c207985b | Add v1 of scripts (tokenizers working) | 2023-02-15 00:56:44 +02:00 |
| Joshua Lochner | 8c9281fc54 | Add template | 2023-02-15 00:54:29 +02:00 |
| Joshua Lochner | e1ba11924c | Create model folder | 2023-02-14 18:36:13 +02:00 |