Entropy draws its insights from deep natural-language understanding (NLU, a subfield of NLP) analysis of written communication.

It extracts the following Emotional IQ signals from text:

  • Positive
  • Negative
  • Emotionally Charged
  • Verbose
  • Unsure
  • Informal
  • Passive Voice
  • Run-on
  • Profane
  • Aggressive
  • Concise
  • and, as a surprise, a few others…
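To make the shape of these signals concrete, here is an illustrative sketch. It is not Entropy's actual model (which is NLU-based, per the text above); it is a toy, keyword- and length-based approximation of a few of the listed signals, with all thresholds and rules invented for illustration.

```python
import re

def toy_signals(text):
    """Return a list of signal labels for `text`.

    Hypothetical heuristics only -- Entropy's real analysis is NLU-based,
    not simple rules like these.
    """
    signals = []
    words = text.split()
    if len(words) <= 12:
        signals.append("Concise")          # short messages (toy threshold)
    if len(words) > 25:
        signals.append("Verbose")          # long messages (toy threshold)
    if re.search(r"\b(maybe|perhaps|not sure|i think)\b", text, re.I):
        signals.append("Unsure")           # hedging phrases
    if "!" in text or text.isupper():
        signals.append("Emotionally Charged")  # exclamation or all caps
    return signals

print(toy_signals("Hello, we are on a mission to change written communication."))
```

A real system would combine many such cues (and far subtler ones) via trained language models rather than hand-written rules.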


Try Me

API Examples:

languageML('Hello, we are on a mission to change written communication.')
## [1] "Concise"   "Informal"  "Receptive" "Positive"
languageML("Don't get us wrong, we love existing tools such as Gmail and Slack. Yet, we believe that our communication is poor (Emojis - come on...). We are capable of much more!")
## [1] "Thoughtful" "Informal"   "Verbose"


Places where linguists traditionally look to see [entropy] are not where the fundamentals of language are.

Marcelo A. Montemurro and Damián H. Zanette
