tz March 22nd, 2022 at 02:35 pm
    Stanford Neural Machine Translation Systems for Spoken Language Domains.

    Minh-Thang Luong and Christopher D. Manning.
    http://www.statmt.org/OSMOSES/Stanford-IWSLT-15.pdf

    Fast domain adaptation for neural machine translation.

    Markus Freitag and Yaser Al-Onaizan.

    http://arxiv.org/abs/1612.06897

    Domain specialization: a post-training domain adaptation for neural machine translation.

    Christophe Servan, Josep Maria Crego, and Jean Senellart.

    http://arxiv.org/abs/1612.06141

    An empirical comparison of simple domain adaptation methods for neural machine translation.

    Chenhui Chu, Raj Dabre, and Sadao Kurohashi.

    http://arxiv.org/abs/1701.03214

    Neural Machine Translation Training in a Multi-Domain Scenario.

    Hassan Sajjad, Nadir Durrani, Fahim Dalvi, Yonatan Belinkov, Stephan Vogel

    https://arxiv.org/pdf/1708.08712.pdf
    Source: reposted from CSDN blogger warrioR_wx under the CC 4.0 BY-SA license; original link: https://blog.csdn.net/wangxinginnlp/article/details/77717570

    tz February 21st, 2022 at 04:48 pm

    Simplification of English and Bengali Sentences for Improving Quality of Machine Translation (sentence simplification)

    tz February 21st, 2022 at 04:47 pm

    What does the greatest loneliness look like? For ordinary people like us, I think it is roughly this: we accomplish many things, yet in the end those things seem destined to be forgotten, or never mentioned at all. We come with nothing, and we will leave with nothing. So in real life loneliness is unavoidable; whatever stage of life we are in, there is a loneliness particular to that stage. That is why we need to keep striving and keep accumulating experiences. Successful or not, they all eventually become our memories, and memory is surely the greatest weapon against loneliness, because remembering lasts far longer than the experience itself. Every village has old people who sit idle at their doorways all day, waiting. What can rescue them from their loneliness? Only their memories of the past and their hopes for the future. We are the same. Fortunately, our loneliness today is not built on being invaded by foreign powers; fortunately, our loneliness has not been wiped away by a hurricane; fortunately, our loneliness rests on the strong China of today.

    Loneliness itself is not frightening. What is frightening is going through our whole lives without ever doing anything on our own terms; what is frightening is never daring to face loneliness alone; what is frightening is believing we are never lonely. Loneliness is unavoidable, but life gets better because of everyone who carries on under its weight.

    tz February 21st, 2022 at 04:47 pm

    When someone is at a low point, do not disturb them. When someone is riding high, do not toy with them. When people meet, do not scheme against them. When people part, do not disparage them. When striving, act like a decent person. Only when you can let go are you truly a person.

    tz December 27th, 2021 at 05:53 pm

    Introductory psychology and neuroscience reading:
    The Psychology Book: Big Ideas Simply Explained
    Psychology: Everything You Need to Know to Master the Subject
    Psych 101
    Neuroscience, Dale Purves: covers the fundamentals of every area of neuroscience, including how signals arise and how they shape function (memory, sleep, normal behavior, emotion, etc.)
    Neuroscience: Exploring the Brain, Mark F. Bear: explains behavior simply and directly in terms of neuroscience
    Principles of Neural Science: the introductory bible of neuroscience; comprehensive and deep

    tz December 17th, 2021 at 10:31 am

    Yili, do you still remember?

    tz December 13th, 2021 at 11:16 am

    Dreamed of bombs being dropped. I kept watching the sky and dodging, and it went on for a long time. At the end someone said it was just a movie being filmed, and I thought: then how did those people die? The shock waves felt very real; I was thrown into the air several times.

    tz November 18th, 2021 at 12:20 pm

    Google 2020 blog, machine learning:
    Data mining:
    Large Scale Parallel Document Mining for Machine Translation
    Effective Parallel Corpus Mining using Bilingual Sentence Embeddings
    Denoising Neural Machine Translation Training with Trusted Data and Online Data Selection (denoising)
    Models:
    Learning a Multi-Domain Curriculum for Neural Machine Translation

    tz November 1st, 2021 at 11:08 am

    The weather is slowly getting colder.

    tz October 26th, 2021 at 11:16 am

    For character-level machine translation, adding a one-dimensional convolution layer on top of the embeddings gives good results (see "Why don't people use character-level machine translation?").
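A minimal NumPy sketch of that idea, not any paper's exact architecture: embed each character, then slide a width-k one-dimensional convolution over the embedding sequence. All names, dimensions, and the toy vocabulary here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

vocab = {ch: i for i, ch in enumerate("abcdefgh ")}
d_emb, k, d_out = 8, 3, 16            # embedding dim, kernel width, conv channels

E = rng.normal(size=(len(vocab), d_emb))   # character embedding table
W = rng.normal(size=(k * d_emb, d_out))    # 1-D conv kernel, flattened over the window

def char_conv(text):
    """Embed characters, then apply a width-k 1-D convolution (stride 1, no padding)."""
    ids = [vocab[c] for c in text]
    x = E[ids]                                              # (T, d_emb)
    T = len(ids)
    windows = np.stack([x[t:t + k].reshape(-1)              # each window flattened
                        for t in range(T - k + 1)])         # (T-k+1, k*d_emb)
    return np.maximum(windows @ W, 0.0)                     # ReLU, (T-k+1, d_out)

h = char_conv("a bad cafe")   # 10 characters -> 8 conv positions
```

In a real model the conv output would feed the encoder in place of (or stacked on) the raw character embeddings, letting the encoder see short character n-grams rather than single characters.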

    tz October 22nd, 2021 at 06:19 pm

    Paper list:
    Training Tips for the Transformer Model
    Semantics-aware Attention Improves Neural Machine Translation
    Why don’t people use character-level machine translation?
    MSP: Multi-Stage Prompting for Making Pre-trained Language Models Better Translators
    Well-classified Examples are Underestimated in Classification with Deep Neural Networks

    tz October 21st, 2021 at 04:51 pm

    Every major paradigm shift in linguistics has largely been triggered by a change in how the field views its basic data.

    tz October 13th, 2021 at 04:17 pm

    Testing today: on most of the sentences tested, DeepL's translations were fluent with good word order.

    tz October 13th, 2021 at 11:18 am

    in a couple of days

    tz October 13th, 2021 at 11:18 am

    How should machine translation handle both precise and fuzzy numbers, e.g., the idiom 千钧一发 (literally "a thousand jun hanging by a hair") versus a literal figure like 123.1?

    tz October 9th, 2021 at 05:33 pm

    AAAI 2018, Translating Pro-Drop Languages with Reconstruction Models. For spoken-language domains where pronouns are dropped, the paper introduces a reconstructor (similar to an auto-encoder) that regenerates the source sentence; a reconstructor is also attached on the decoder side.
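A minimal sketch of the training objective behind reconstruction-based NMT: the usual translation loss plus a weighted loss for regenerating the source. The helper names and toy numbers are hypothetical; the actual model backpropagates through encoder/decoder hidden states rather than fixed probability tables.

```python
import numpy as np

def nll(logp, ids):
    """Mean negative log-likelihood of token ids under per-step log-probs (T, V)."""
    return -float(np.mean(logp[np.arange(len(ids)), ids]))

def joint_loss(trans_logp, tgt_ids, rec_logp, src_ids, lam=1.0):
    """Translation NLL plus a lambda-weighted source-reconstruction NLL."""
    return nll(trans_logp, tgt_ids) + lam * nll(rec_logp, src_ids)

# Toy example: 2 target steps and 2 source steps over a 3-word vocabulary.
trans_logp = np.log(np.array([[0.7, 0.2, 0.1],
                              [0.1, 0.8, 0.1]]))
rec_logp = np.log(np.array([[0.6, 0.3, 0.1],
                            [0.2, 0.2, 0.6]]))
loss = joint_loss(trans_logp, [0, 1], rec_logp, [0, 2], lam=0.5)
```

The reconstruction term penalizes the model when the source (including dropped pronouns) cannot be recovered from the translation, which is what pushes it to translate pro-drop pronouns explicitly.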

    tz October 9th, 2021 at 05:28 pm

    TACL 2018, Modeling Past and Future for Neural Machine Translation. The decoder state in machine translation has to carry three tasks: past, present, and future; the authors separate the past and future tasks out and model them independently.

    tz September 29th, 2021 at 05:42 pm

    https://github.com/lena-voita/the-story-of-heads open-source machine-translation visualization

    tz September 29th, 2021 at 05:41 pm

    Facebook open source: https://github.com/facebookresearch/UnsupervisedMT/

    tz September 29th, 2021 at 05:39 pm

    Analyzing the Source and Target Contributions to Predictions in Neural Machine Translation (translation visualization)

    tz September 29th, 2021 at 05:06 pm

    Revisiting Low-Resource Neural Machine Translation: A Case Study (low-resource translation paper)

    admin August 19th, 2021 at 05:52 pm

    Ehud Reiter 博客:https://ehudreiter.com/

    admin August 19th, 2021 at 09:42 am

    Paper list:
    Dictionary-based Data Augmentation for Cross-Domain Neural Machine Translation
    Intelligent Selection of Language Model Training Data
    Chinese Syntactic Reordering for Statistical Machine Translation
    Train, Sort, Explain: Learning to Diagnose Translation Models
    Statistical Power and Translationese in Machine Translation Evaluation
    Neural Machine Translation with Reconstruction
    Learning Deep Transformer Models for Machine Translation
    Improving Deep Transformer with Depth-Scaled Initialization and Merged Attention
    Levenshtein Transformer
    Reformer: The Efficient Transformer

    admin August 13th, 2021 at 12:05 am

    The MT-reading-list is an excellent resource; keep reading it and keep working on translation.

Contact

About me

  • I come from a small mountain village in the north; I keep a low profile, study hard, and have fun.