Commit 501425a2
Authored 3 years ago by Julien Rabault
Add logs

Parent: ff8dc3af
No related tags found
2 merge requests: !6 Linker with transformer, !5 Linker with transformer
Showing 3 changed files with 5 additions and 6 deletions:

  .gitignore        +1 −1
  Linker/Linker.py  +1 −2
  train.py          +3 −3
.gitignore  +1 −1  (view file @ 501425a2)

-deepgrail_Tagger
+SuperTagger
 Utils/silver
 Utils/gold
 .idea
...
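Note: the ignore entry tracks a directory rename; the same new name shows up in the Linker/Linker.py hunk below, where "from Supertagger import *" becomes "from SuperTagger import *".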
Linker/Linker.py  +1 −2  (view file @ 501425a2)

@@ -6,7 +6,6 @@ import time
 import torch
 import torch.nn.functional as F
-from torch import Module
 from torch.nn import Sequential, LayerNorm, Dropout
 from torch.optim import AdamW
 from torch.utils.data import TensorDataset, random_split
@@ -23,7 +22,7 @@ from Linker.atom_map import atom_map
 from Linker.eval import mesure_accuracy, SinkhornLoss
 from Linker.utils_linker import find_pos_neg_idexes, get_atoms_batch, FFN, get_axiom_links, get_pos_encoding_for_s_idx, \
     get_neg_encoding_for_s_idx
-from Supertagger import *
+from SuperTagger import *
 from utils import pad_sequence

 def format_time(elapsed):
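The second hunk's trailing context shows a format_time(elapsed) helper whose body stays collapsed in this view. For orientation, here is a minimal sketch of what a helper with that signature usually does in training scripts; the body is an assumption, not code from this commit:

    import datetime

    def format_time(elapsed):
        # Hypothetical body: render a duration given in seconds
        # as an h:mm:ss string for log output.
        elapsed_rounded = int(round(elapsed))
        return str(datetime.timedelta(seconds=elapsed_rounded))

Called as format_time(time.time() - t0), such a helper turns raw second counts into readable log lines, which fits the commit message "Add logs".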
train.py  +3 −3  (view file @ 501425a2)

@@ -5,13 +5,13 @@ from utils import read_csv_pgbar
 torch.cuda.empty_cache()

 batch_size = int(Configuration.modelTrainingConfig['batch_size'])
-nb_sentences = batch_size * 40
+nb_sentences = batch_size * 2
 epochs = int(Configuration.modelTrainingConfig['epoch'])

 file_path_axiom_links = 'Datasets/gold_dataset_links.csv'
 df_axiom_links = read_csv_pgbar(file_path_axiom_links, nb_sentences)

 print("Linker")
-linker = Linker("models/model_supertagger.pt")
-print("Linker Training")
+linker = Linker("models/flaubert_super_98%_V2_50e.pt")
+print("\nLinker Training\n\n")
 linker.train_linker(df_axiom_links, validation_rate=0.1, epochs=epochs, batch_size=batch_size, checkpoint=False, tensorboard=True)
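Reading only the new side of the hunk, the training entry point after this commit boils down to the script below. The three import lines are assumptions about the repository layout (the hunk header only confirms that read_csv_pgbar comes from utils); the rest is taken directly from the diff:

    import torch

    from Configuration import Configuration  # assumed module path
    from Linker.Linker import Linker         # assumed module path
    from utils import read_csv_pgbar         # confirmed by the hunk header

    torch.cuda.empty_cache()

    # Hyperparameters come from the project's configuration file.
    batch_size = int(Configuration.modelTrainingConfig['batch_size'])
    nb_sentences = batch_size * 2  # reduced from batch_size * 40 in this commit
    epochs = int(Configuration.modelTrainingConfig['epoch'])

    # Load the gold axiom-links dataset with a progress bar.
    file_path_axiom_links = 'Datasets/gold_dataset_links.csv'
    df_axiom_links = read_csv_pgbar(file_path_axiom_links, nb_sentences)

    print("Linker")
    linker = Linker("models/flaubert_super_98%_V2_50e.pt")

    print("\nLinker Training\n\n")
    linker.train_linker(df_axiom_links, validation_rate=0.1, epochs=epochs,
                        batch_size=batch_size, checkpoint=False, tensorboard=True)

The newline padding around the "Linker Training" banner and the much smaller nb_sentences both point at short, readable console runs, consistent with the stated purpose of the commit, "Add logs".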