I’m working on an artificial intelligence question and need an explanation to help me understand better. Embeddings of words are often vectors of numbers capturing the contexts in which a word occurs. There are two types of word embeddings, exemplified by (1) representing a word as a vector of the frequencies of other terms occurring nearby, and (2) a word’s word2vec skip-gram embedding. Provide at least two disadvantages of using the former in comparison to the latter.
Requirements: a few sentences to a paragraph | .doc file | Python
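
For concreteness, here is a minimal Python sketch contrasting the two representations on a toy two-sentence corpus. The corpus, window size, and vector size are hypothetical choices for illustration, and the skip-gram part assumes gensim >= 4.0 (which uses the vector_size keyword):

from collections import Counter

from gensim.models import Word2Vec

# Toy corpus (hypothetical, for illustration only)
corpus = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "rug"],
]

# (1) Count-based representation: frequencies of other terms occurring
# within a +/-2 token window around the target word.
def cooccurrence_vector(target, sentences, window=2):
    counts = Counter()
    for sent in sentences:
        for i, tok in enumerate(sent):
            if tok == target:
                lo, hi = max(0, i - window), min(len(sent), i + window + 1)
                counts.update(t for j, t in enumerate(sent[lo:hi], start=lo) if j != i)
    return counts  # one dimension per vocabulary term -> sparse, vocabulary-sized

print(cooccurrence_vector("sat", corpus))

# (2) Skip-gram embedding: a dense, low-dimensional vector learned by
# training the model to predict context words (sg=1 selects skip-gram).
model = Word2Vec(corpus, vector_size=50, window=2, min_count=1, sg=1)
print(model.wv["sat"].shape)  # (50,): dimensionality chosen independently of vocabulary size

Note how the count-based vector's length is tied to the vocabulary size and is mostly zeros, while the skip-gram vector is dense with a fixed, chosen dimensionality; that contrast is a natural starting point for the disadvantages the question asks about.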