
Natural Language Processing

2020 July 14

I Апрельский... in Natural Language Processing
Never Give up
I am reading papers about aspect-based sentiment analysis, and I know that a word can be represented by word embeddings, tf-idf, or bag of words. But in some papers they use morphological, syntactic, and semantic features without any mention of how they encode them. What does this mean? How do they extract these features and encode them as numbers? If anyone knows a blog, book, video, or anything that explains it, I will be thankful.
what language are you talking about?

German Zvonchuk in Natural Language Processing
I Апрельский
a "forklow" like this comes to mind, which seems typical:
1) extract the entities in the text
2) normalize them
3) search over the normalized forms
"Forklow"? What is that in English?

I am currently looking at spacy and prodigy, but I don't know whether they will help me or not.
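The quoted workflow (extract entities, normalize, search over normalized forms) can be sketched roughly as below. This is a toy illustration: the entity list and the normalization table are made up, and a real pipeline would get step 1 from an NER tool such as spaCy's `doc.ents`.

```python
# Toy sketch of the workflow: 1) extract entities, 2) normalize,
# 3) search by normalized form. Entities and the normalization
# table are hand-made here; a real pipeline would use an NER model.

NORMALIZE = {"apples": "apple", "Apple's": "apple"}  # hypothetical table

def normalize(entity):
    """Map a surface form to a canonical key."""
    return NORMALIZE.get(entity, entity.lower())

def build_index(entities):
    """Group surface forms under their normalized key for lookup."""
    index = {}
    for e in entities:
        index.setdefault(normalize(e), []).append(e)
    return index

index = build_index(["apples", "Apple's", "Google"])
print(index["apple"])  # ['apples', "Apple's"] — both forms map to one key
```

Searching by the normalized key then retrieves every surface variant of the same entity.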

parvez in Natural Language Processing
I have a question regarding text preprocessing:

1. Does anyone have a list contrasting lemmatisation and stemming across different libraries like NLTK, spacy, keras.preprocessing, and others?

2. How should we decide which tool is best suited for the task?

3. Does working with deep learning models require preprocessing? (I am still new to seq2seq models.) Any best practices to keep in mind for the tasks above?

thank you.
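The contrast behind question 1 can be shown with a self-contained toy: a stemmer chops suffixes by rule and may produce non-words, while a lemmatizer looks the word up and returns a valid dictionary form. `toy_stem`, `toy_lemmatize`, and `LEMMA_TABLE` below are made up for illustration; real code would use NLTK's `PorterStemmer` / `WordNetLemmatizer` or spaCy's `token.lemma_`.

```python
# Toy contrast of stemming vs lemmatization (not a real library API).

def toy_stem(word):
    """Crude suffix-stripping, in the spirit of Porter-style stemmers."""
    for suffix in ("ies", "ing", "ed", "es", "s"):
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[: -len(suffix)]
    return word

# A lemmatizer needs a vocabulary; here a tiny hand-made lookup.
LEMMA_TABLE = {"studies": "study", "better": "good", "ran": "run"}

def toy_lemmatize(word):
    return LEMMA_TABLE.get(word, word)

for w in ["studies", "better", "ran"]:
    print(w, "->", toy_stem(w), "|", toy_lemmatize(w))
# "studies" stems to the non-word "stud" but lemmatizes to "study";
# irregular forms like "better"/"ran" only resolve via the lemma table.
```

That difference is also a rough answer to question 2: stemming is cheap and vocabulary-free, lemmatization is slower but returns real words, which matters when downstream components consume the tokens.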

parvez in Natural Language Processing
parvez
I have a question regarding text preprocessing:

1. Does anyone have a list contrasting lemmatisation and stemming across different libraries like NLTK, spacy, keras.preprocessing, and others?

2. How should we decide which tool is best suited for the task?

3. Does working with deep learning models require preprocessing? (I am still new to seq2seq models.) Any best practices to keep in mind for the tasks above?

thank you.
For the English language.

I Апрельский... in Natural Language Processing
German Zvonchuk
"Forklow"? What is that in English?

I am currently looking at spacy and prodigy, but I don't know whether they will help me or not.
Sorry, I meant "workflow". In short, an "algorithm".

Never Give up in Natural Language Processing
I Апрельский
what language are you talking about?
Any language. The paper is about Arabic, but I want to understand this for any language.

I Апрельский... in Natural Language Processing
Never Give up
Any language. The paper is about Arabic, but I want to understand this for any language.

I Апрельский... in Natural Language Processing
So, I see the authors use a tool named AraNLP.

Never Give up in Natural Language Processing
I Апрельский
So, I see the authors use a tool named AraNLP.
I am reading the same paper, but because I don't have a good background, I don't know how they convert the features to numbers.

Never Give up in Natural Language Processing
Thank you a lot, and I am sorry for being annoying 🙈
2020 July 15

I Апрельский... in Natural Language Processing
Never Give up
I am reading the same paper, but because I don't have a good background, I don't know how they convert the features to numbers.
Ok. I got it.
I think you need to read something fairly basic. This raises the question of which pipeline you want to use: roughly speaking, shallow models or transfer learning. Although, probably, in both cases the features will need to be encoded using one-hot encoding (OHE). I'll try to find a tutorial now.
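To make the OHE suggestion concrete: a categorical linguistic feature (say, a POS tag) becomes a binary vector with a single 1 at the position of its value. The tag set below is a made-up example, not from the paper being discussed.

```python
# One-hot encoding (OHE): each categorical feature value becomes
# a binary indicator vector with exactly one 1.

POS_TAGS = ["NOUN", "VERB", "ADJ", "ADV"]  # hypothetical tag set

def one_hot(value, vocabulary):
    """Return a 0/1 list with a 1 at the position of `value`."""
    vec = [0] * len(vocabulary)
    vec[vocabulary.index(value)] = 1
    return vec

print(one_hot("VERB", POS_TAGS))  # [0, 1, 0, 0]
```

A token's morphological, syntactic, and semantic features can then be concatenated into one numeric vector and fed to a shallow model.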

Never Give up in Natural Language Processing
I Апрельский
Ok. I got it.
I think you need to read something fairly basic. This raises the question of which pipeline you want to use: roughly speaking, shallow models or transfer learning. Although, probably, in both cases the features will need to be encoded using one-hot encoding (OHE). I'll try to find a tutorial now.
Yes, this is what I want. Thank you.

Den in Natural Language Processing
Good morning, a question came up:
how feasible is it at all today to make a replica of GPT-3 (for non data scientists)?
Here it is using the second one as an example: https://blog.usejournal.com/opengpt-2-we-replicated-gpt-2-because-you-can-too-45e34e6d36dc

toriningen in Natural Language Processing
Den
Good morning, a question came up:
how feasible is it at all today to make a replica of GPT-3 (for non data scientists)?
Here it is using the second one as an example: https://blog.usejournal.com/opengpt-2-we-replicated-gpt-2-because-you-can-too-45e34e6d36dc
Is the question about the availability of the algorithms, or of the hardware? 🙂

Den in Natural Language Processing
That is, to make a replica that hits an acceptable middle ground between 13B and 175B.

Den in Natural Language Processing
toriningen
Is the question about the availability of the algorithms, or of the hardware? 🙂
All of the above.

toriningen in Natural Language Processing
They don't seem to have changed the algorithms; the architecture is open. They just stacked even more layers and widened the context.
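Why "just stacking layers" is the whole story cost-wise can be seen with a back-of-envelope parameter count for a GPT-style decoder. The per-layer estimate of roughly 12·d_model² (attention projections plus MLP) ignores biases and layer norms, so treat it as a rough sketch; the layer counts and widths below are the published GPT-2 XL and GPT-3 shapes, rounded.

```python
# Back-of-envelope parameter count for a GPT-style decoder:
# each transformer layer has ~12 * d_model^2 parameters
# (QKV + output projections plus the 4x MLP), plus token embeddings.

def gpt_params(n_layers, d_model, vocab=50257):
    per_layer = 12 * d_model ** 2
    embeddings = vocab * d_model
    return n_layers * per_layer + embeddings

# GPT-2 XL (48 layers, d_model 1600) vs GPT-3 (96 layers, d_model 12288):
print(f"{gpt_params(48, 1600):,}")    # ~1.5 billion
print(f"{gpt_params(96, 12288):,}")   # ~175 billion
```

The ~100x jump in parameters is what makes replicating GPT-3 a hardware-budget problem rather than an algorithmic one.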

toriningen in Natural Language Processing
As for the hardware... that's a question of budget ¯\_(ツ)_/¯

toriningen in Natural Language Processing
If I had as much money as they do, I would probably be training GPT-3 out of boredom too.