Using state-of-the-art Natural Language Processing (NLP), our Recommender System connects content across our platforms based on contextual similarity and shared readership. Alongside near human-level content understanding, the system handles a large volume of requests and users thanks to its scalable, cloud-based architecture and GPU deployment.
With a huge corpus of human-written texts, drawn from various sources and covering different topics and styles of expression, our system has a considerable advantage over competing solutions. These valuable textual assets, combined with state-of-the-art machine learning methods and the first regional distributed system running on GPUs, allow our Recommender to boost traffic and ensure that relevant content reaches our users. The system was developed completely in-house, from scratch, so that cutting-edge technologies are used both for machine learning and for infrastructure and deployment.

The underlying NLP models are deep neural networks that understand high-level abstract concepts in text. This enables powerful matching in real time: two articles on the same subject can be matched even when they share few common words. Models are retrained regularly so that current and relevant topics stay covered. Along with contextual similarity, the system takes various other parameters into account, such as the age of the content, the length of the article, and its popularity.

The system has yielded a significant increase in CTR and visibility, especially for content from smaller niche portals, whose specialised articles would otherwise stay hidden from most users. In this way, the entire ecosystem works in an orchestrated fashion, increasing traffic on all portals, boosting the smaller ones, and enhancing the overall user experience.
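As an illustrative sketch of the matching and ranking described above, items can be scored by combining embedding similarity with content age and popularity. The function names, decay constants, and weights below are hypothetical, not the production values:

```python
import math

def cosine_similarity(a, b):
    """Contextual similarity between two embedding vectors
    (as produced by a deep NLP model)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def recommendation_score(similarity, age_hours, popularity,
                         half_life_hours=48.0, popularity_weight=0.3):
    """Blend contextual similarity with recency decay and popularity.

    Hypothetical weighting -- the production system uses its own tuned
    parameters for ageing, article length, popularity and so on.
    """
    recency = 0.5 ** (age_hours / half_life_hours)  # exponential ageing
    return similarity * recency * (1.0 + popularity_weight * popularity)
```

Under this kind of scheme a fresh, popular article with high contextual similarity ranks first, and an ageing article's score decays smoothly rather than dropping out abruptly.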
Integration with the service is straightforward. We can include third-party content providers in our ecosystem, or implement a stand-alone Recommender instance for a client.
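As a sketch of what a client integration might involve, a recommendation request could be a simple JSON call. The field names below are illustrative assumptions, not the actual API contract:

```python
import json

def build_recommendation_request(article_id, max_items=5, portal=None):
    """Assemble a hypothetical JSON payload asking the Recommender for
    items related to a given article. Field names are illustrative only."""
    payload = {"article_id": article_id, "max_items": max_items}
    if portal is not None:
        payload["portal"] = portal  # optionally restrict to one portal
    return json.dumps(payload)
```

For example, `build_recommendation_request("abc-123", max_items=3)` would request the three most relevant items for article `abc-123`.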