Embeddings are a really neat trick that often comes wrapped in a pile of intimidating jargon. If you can make it through that jargon, they unlock powerful and exciting techniques that can be applied to all sorts of interesting problems.
Google Research put out an influential paper 10 years ago describing an early embedding model they created called Word2Vec that takes single words and turns them into a list of 300 numbers.
That list of numbers captures something about the meaning of the associated word. This is best illustrated by a demo.
Here are some example results from that demo, showing the words nearest to a given input:
wiki ⇒ No vector for that word. Try another.
ward ⇒ elliott, bailey, miller, wheeler, shields, wyatt, anderson, wilson, thompson.
creative ⇒ artistic, ideas, marketing, clever, imagination, visual, talents, design, talented.
pattern ⇒ patterns, norm, phenomenon, chart, thread, behavior, theme, habits, scenario.
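A demo like this can be built by comparing vectors with cosine similarity: words whose vectors point in similar directions tend to have similar meanings. Here is a minimal sketch in Python, using made-up 3-dimensional vectors rather than real 300-dimensional Word2Vec vectors (the words and numbers are purely illustrative):

```python
import math

# Toy embedding table: made-up 3-dimensional vectors standing in for
# real 300-dimensional Word2Vec vectors.
embeddings = {
    "creative":  [0.90, 0.10, 0.20],
    "artistic":  [0.85, 0.15, 0.25],
    "marketing": [0.70, 0.30, 0.10],
    "pattern":   [0.10, 0.90, 0.30],
    "patterns":  [0.12, 0.88, 0.32],
}

def cosine_similarity(a, b):
    # Dot product of the two vectors, normalized by their lengths.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def nearest(word, n=3):
    # Mirrors the demo's behavior for out-of-vocabulary words.
    if word not in embeddings:
        return "No vector for that word. Try another."
    target = embeddings[word]
    # Score every other word against the target, highest similarity first.
    scored = sorted(
        ((cosine_similarity(target, vec), other)
         for other, vec in embeddings.items()
         if other != word),
        reverse=True,
    )
    return [other for _, other in scored[:n]]

print(nearest("creative"))  # most similar toy words, best match first
print(nearest("wiki"))      # word not in the toy vocabulary
```

With real Word2Vec vectors the lookup works the same way; the only differences are the size of the table (hundreds of thousands of words) and the length of each vector.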