DeepGrail Tagger (PNRIA / Global Helper)

Commit eeb4774c
Authored 3 years ago by Julien Rabault
Parent: ea12a5e7
No related branches, tags, or merge requests.

Showing 1 changed file: SuperTagger/Utils/SymbolTokenizer.py (6 additions, 7 deletions)
@@ -11,33 +11,32 @@ def load_obj(name):

The hunk rewrites the constructor docstring, which previously read
"@params tokenizer (PretrainedTokenizer): Tokenizer that tokenizes text",
to document the actual index_to_super argument. The code after the change:

# torch is presumably imported earlier in the file; torch.tensor is used below.

class SymbolTokenizer():

    def __init__(self, index_to_super):
        """
        @params index_to_super: dict used to convert IDs to tags
        """
        self.index_to_super = index_to_super
        # Inverse mapping: tag -> integer ID.
        self.super_to_index = {v: int(k) for k, v in self.index_to_super.items()}

    def lenSuper(self):
        """
        @return the length of the ID-to-tag dict, plus one
        """
        return len(self.index_to_super) + 1

    def convert_batchs_to_ids(self, tags, sents_tokenized):
        # Map each tag to its integer ID, sentence by sentence.
        encoded_labels = []
        labels = [[self.super_to_index[str(symbol)] for symbol in sents]
                  for sents in tags]
        for l, s in zip(labels, sents_tokenized):
            # Zero-pad each label sequence to the length of its tokenized sentence.
            super_tok = pad_sequence(l, len(s))
            encoded_labels.append(super_tok)
        return torch.tensor(encoded_labels)

    def convert_ids_to_tags(self, tags_ids):
        # Map IDs back to tags, dropping any '<unk>' entries.
        labels = [[self.index_to_super[int(symbol)] for symbol in sents
                   if self.index_to_super[int(symbol)] != '<unk>']
                  for sents in tags_ids]
        return labels


def pad_sequence(sequences, max_len=400):
    # Right-pad a sequence with zeros up to max_len.
    padded = [0] * max_len
    padded[:len(sequences)] = sequences
    return padded
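
For context, a minimal usage sketch (not part of the commit): the index_to_super
inventory below is invented; it uses integer keys, since convert_ids_to_tags
indexes the dict with int(symbol), and ID 0 doubles as both the pad value and
'<unk>', so padding is filtered out on the round trip.

# Hypothetical tag inventory; these entries are illustrative, not from the repository.
index_to_super = {0: '<unk>', 1: 'np', 2: 'np\\s', 3: 's/np'}

tokenizer = SymbolTokenizer(index_to_super)
print(tokenizer.lenSuper())  # 5 (4 entries + 1)

tags = [['np', 's/np'], ['np\\s']]
sents_tokenized = [['the', 'cat'], ['it', 'sleeps']]  # equal lengths, so torch.tensor can stack rows

ids = tokenizer.convert_batchs_to_ids(tags, sents_tokenized)
print(ids)                                 # tensor([[1, 3], [2, 0]])
print(tokenizer.convert_ids_to_tags(ids))  # [['np', 's/np'], ['np\\s']]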