… 245 Rafal Rzepka, Shinji Muraji and Akihiko Obayashi / Utilizing Wikipedia for Retrieving Synonyms of Trade Security-related Technical Terms …
label | publication date | main subject
Circadian patterns of Wikipedia editorial activity: a demographic analysis | 2012 | Wikipedia; circadian rhythm; demographics; English Wikipedia; academic studies about Wikipedia
Accuracy and completeness of drug information in Wikipedia: a comparison … | |
…r Emacs
emacs-w3m-1.4.632.20210106.2144nb1  Emacs frontend for w3m browser
emacs-wiki-2.72nb2                  Wiki major mode for Emacs, which aims for implicit and natural markup
emacs20-20.7nb26                    GNU editing macros (editor)
emacs20-elib-1.0nb5                 Library of utility functions for Emacs
emacs21-21.4a…
… Proceedings of the First Workshop on Advancing Natural Language Processing for Wikipedia (13 papers)
Proceedings of the Eighth Widening NLP Workshop (1 paper)
Proceedings of the Ninth Conference on Machine Translation (134 papers)
Proceedings of the 6th Workshop on Narrative Understan…
…word was chosen using semantic vectors created from a large Polish corpus (nkjp+wiki-forms-all-300-cbow-ns, http://dsmodels.nlp.ipipan.waw.pl) created for the CoDeS project (http://zil.ipipan.waw.pl/CoDeS). First, 800 words with the highest cosine similarity scores with each tar…
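The top-k selection by cosine similarity described above can be sketched as follows. This is a minimal illustration, not the CoDeS project's actual code: it assumes the embeddings are already loaded as a NumPy matrix with one row per vocabulary word, and the function name `top_k_similar` is hypothetical.

```python
import numpy as np

def top_k_similar(target_vec, vocab_vecs, vocab_words, k=800):
    """Return the k words whose vectors have the highest cosine
    similarity to target_vec (800 candidates per target word in the text)."""
    # Normalize rows so a dot product equals cosine similarity.
    norms = np.linalg.norm(vocab_vecs, axis=1, keepdims=True)
    unit = vocab_vecs / np.clip(norms, 1e-12, None)
    t = target_vec / max(np.linalg.norm(target_vec), 1e-12)
    sims = unit @ t
    idx = np.argsort(-sims)[:k]
    return [(vocab_words[i], float(sims[i])) for i in idx]

# Toy example with 3-dimensional vectors (real CBOW vectors are 300-dimensional).
words = ["kot", "pies", "dom"]
vecs = np.array([[1.0, 0.0, 0.0],
                 [0.9, 0.1, 0.0],
                 [0.0, 0.0, 1.0]])
print(top_k_similar(np.array([1.0, 0.0, 0.0]), vecs, words, k=2))
```

With normalized rows, a single matrix-vector product ranks the whole vocabulary at once, which is why this scales to corpus-sized vocabularies.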
… we use the original 16GB BookWiki corpus (the Toronto Books Corpus, Zhu et al. 2015, plus English Wikipedia) from Liu et al. … and target languages, LSTM language models leverage the latent hierarchical structure of the input to obtain better performance than a random, Zipfian (…
A monthly overview of recent academic research about Wikipedia and other Wikimedia projects, also published as the Wikimedia Research Newsletter. At least 80 million (3.3%) of Wikipedia's facts are inconsistent; LLMs may help find them. A paper titled "Detecting Corpus-Level K…
…beres: a historiografia digital e colaborativa no projeto Teoria da História na Wikipédia [digital and collaborative historiography in the Teoria da História na Wikipédia project], Revista Brasileira de História, 2020. Among the free-access collaborative projects for organizing big data, Wikipé… stands out.
…ry 22, 2018). See "How do you define Humanities Computing/Digital Humanities?" (wiki page of the University of Alberta, last updated March 16, 2011, <http://www.artsrn.ualberta.ca/taporwiki/index.php/How_do_you_define_Humanities_Computing_/_Digital_Humanities%3F> [accessed, Aug…
…Vol. 29, No. 1 (2020), pp. 1664-1680. KLD, and their proposed methods (QE(MT), QE(Wiki), QE(PubMed)), which rely on term selection using a linear regression model whose features are extracted from MT, Wiki, or PubMed. In the KLD method, the top 7 ranked documents of the CLEF col…
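The KLD term selection mentioned above can be sketched as follows. This is a minimal illustration under common assumptions, not the authors' exact implementation: it scores each candidate term t by p_R(t) · log(p_R(t) / p_C(t)), where p_R is the term's probability in the top-ranked (pseudo-relevant) documents and p_C its probability in the whole collection; the function name `kld_scores` and the tiny smoothing constant are hypothetical.

```python
import math
from collections import Counter

def kld_scores(feedback_docs, collection_docs):
    """Score candidate expansion terms by their Kullback-Leibler
    contribution: p_R(t) * log(p_R(t) / p_C(t))."""
    rel = Counter(t for d in feedback_docs for t in d)
    col = Counter(t for d in collection_docs for t in d)
    n_rel, n_col = sum(rel.values()), sum(col.values())
    scores = {}
    for t, f in rel.items():
        p_r = f / n_rel
        p_c = col[t] / n_col if col[t] else 1e-12  # guard against unseen terms
        scores[t] = p_r * math.log(p_r / p_c)
    return scores

# Toy collection: terms concentrated in the top-ranked documents score highest.
docs_top = [["heart", "attack", "risk"], ["heart", "disease"]]
collection = docs_top + [["weather", "report"], ["stock", "market", "report"]]
s = kld_scores(docs_top, collection)
print(max(s, key=s.get))
```

Terms that are frequent in the pseudo-relevant set but rare in the collection get the largest scores, which is the rationale for using KLD to pick expansion terms from the top-ranked documents.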
… as an object concept description. In addition, they proposed combining Wikipedia and dictionary data to compose action-class descriptions with human supervision for this task. As a result, they could identify objects in videos and provide a representation based on their concep…