Infographic: Customer-Speak, Engineered for Scale

From dresses and bras to rugs and skincare, and everything in between, Lily AI helps retailers better understand what consumers want and helps shoppers find exactly what they need.

This week, The New York Times published a feature highlighting Lily AI’s impact on retail. We often get asked to demystify our AI and quantify our current scale, which, of course, we are happy to do! The infographic below reflects recent volumes and examples of products we’ve enhanced to date, but before we dive in, let’s start with the basics.

It all starts with understanding “consumer-speak.” This includes colloquial search terms, attributes, occasions, trends, and synonyms: in short, the diverse array of conversational language that real people, not merchandisers or fashion insiders, use when they talk and shop. In retail, these terms translate into “product tags,” the product descriptors that form the foundation of what makes a product unique.
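To make the idea concrete, here is a minimal, hypothetical sketch of how colloquial search terms might be normalized into canonical product tags. The dictionary entries, function name, and mappings below are illustrative assumptions, not Lily AI’s actual taxonomy or code.

```python
# Hypothetical sketch: translating "consumer-speak" into canonical product tags.
# Every term and mapping here is illustrative, not Lily AI's actual taxonomy.

CONSUMER_SPEAK_TO_TAG = {
    "comfy": "relaxed-fit",
    "boho": "bohemian",
    "date night": "evening-occasion",
    "cottagecore": "cottagecore",  # some trend terms map to themselves
}

def normalize_query(query: str) -> list[str]:
    """Map colloquial search terms found in a query to canonical tags."""
    query = query.lower()
    tags = [tag for term, tag in CONSUMER_SPEAK_TO_TAG.items() if term in query]
    return tags or [query]  # fall back to the raw query if nothing matches

print(normalize_query("comfy boho dress for date night"))
# ['relaxed-fit', 'bohemian', 'evening-occasion']
```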

Over the past eight years, Lily AI has developed a product taxonomy, an industry-specific language library of 20,000 words that consumers use to describe fashion, home, and beauty products. Lily AI then integrates this product taxonomy with AI, specifically computer vision, natural language processing, machine learning, and vertical-specific large language models. This powerful combination enables Lily AI to process millions of product images and text descriptions, and to automatically generate the most comprehensive and accurate set of tags for every product.
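For illustration, here is a hedged sketch of how a multi-model tagging pipeline along these lines could combine image and text signals under a controlled taxonomy. The models are stand-in stubs and every name is an assumption; this is not Lily AI’s implementation.

```python
# A hedged sketch of a multi-model tagging pipeline. The "models" below are
# hypothetical stubs, and the taxonomy is a toy subset, purely illustrative.

from dataclasses import dataclass

TAXONOMY = {"floral", "midi", "wrap", "puff-sleeve", "wedding-guest"}  # toy subset

@dataclass
class Product:
    image_path: str
    description: str

def vision_tags(image_path: str) -> set[str]:
    """Stand-in for a computer-vision model predicting visual attributes."""
    return {"floral", "midi"}  # hypothetical model output

def text_tags(description: str) -> set[str]:
    """Stand-in for an NLP model extracting attributes from product copy."""
    return {tag for tag in TAXONOMY if tag.replace("-", " ") in description.lower()}

def tag_product(product: Product) -> set[str]:
    """Merge model outputs, keeping only tags in the controlled taxonomy."""
    candidates = vision_tags(product.image_path) | text_tags(product.description)
    return candidates & TAXONOMY

item = Product("dress_123.jpg", "A wrap dress with puff sleeve details")
print(sorted(tag_product(item)))  # ['floral', 'midi', 'puff-sleeve', 'wrap']
```

Intersecting the candidate tags with the taxonomy keeps the output inside a controlled vocabulary, which is what makes tags consistent and searchable across an entire catalog.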
