A software development house specialising in hyper-scale repositories for structured and unstructured content

About Us

Yell Research is a software development house specialising in business solutions that manage both structured and unstructured data at hyper-scale.


We focus on using cutting-edge open source platforms and frameworks to solve problems for our customers. We have successfully built and deployed solutions for customers across Asia-Pacific.


Our services include:

» Implementing hyper-scale content repositories - of the order of several billion items

» Deploying solutions in the Cloud: AWS, Azure and GCP

» Building applications on the Alfresco Content Services platform

» Developing Content Services: document, digital asset and archive management


Our Projects

In a digital world, content is at the centre of much of what we do. We solve interesting problems that impact people and improve how they work. Since 2010, we have delivered over 40 successful projects to our clients. We serve leading enterprises across Asia-Pacific including:


» Banks, insurance and finance

» Professional services

» Public & cultural institutions

» Private enterprise


Some of our recent successes include:


Health Care Provider

Yell Research teamed up with its partner Lateral Minds to develop and implement a hyper-scale content repository in the Cloud. Over 10 billion content items are planned to be stored and managed in the repository.


Yell Research worked with Lateral Minds to create a benchmark 10-billion-item repository to prove that a data set of this scale is operational and manageable.


How "big" is a 10-billion-item repository? It's big - a big-data problem. Suppose one can ingest 1,000 items per second: it would take 10 million seconds, or 2,778 hours, or 116 days to build a repository of this scale.


The system we developed utilised partitioning and parallelism to solve this problem. Our team has the know-how to design and build a repository of this scale and benchmark it in a couple of days - not months.
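To illustrate the arithmetic above, here is a back-of-the-envelope calculation (a sketch with assumed, illustrative figures: 1,000 items per second per ingestion worker, and ideal linear scaling across partitions) showing why partitioned, parallel ingestion turns months into days:

```python
# Back-of-the-envelope ingestion-time calculation for a hyper-scale
# repository. All figures are illustrative assumptions, not measured
# numbers from any specific system.

TOTAL_ITEMS = 10_000_000_000   # 10 billion content items
RATE_PER_WORKER = 1_000        # assumed items/second per ingestion worker
SECONDS_PER_DAY = 86_400

def ingestion_days(workers: int) -> float:
    """Wall-clock days to ingest all items, assuming the work is
    partitioned evenly across `workers` parallel ingestion workers."""
    seconds = TOTAL_ITEMS / (RATE_PER_WORKER * workers)
    return seconds / SECONDS_PER_DAY

print(f"1 worker:   {ingestion_days(1):.0f} days")   # sequential: ~116 days
print(f"64 workers: {ingestion_days(64):.1f} days")  # partitioned: under 2 days
```

In practice, scaling is rarely perfectly linear - database contention, network throughput and index maintenance all take their toll - but the calculation shows the order-of-magnitude gain that partitioning makes possible.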


Government Department

We developed and supported the design and implementation of a major Enterprise Content Management platform enabling over 600 users to create and collaborate on legal documents.


What does it take to keep such a system up and running 24 x 7? Our team knows.


Cultural Institution

Yell Research developed technologies to enable a major art museum to digitise and publish their art collection to the public worldwide.


As well as managing a complex taxonomy, the system supports "publishing" a work of art in many renditions. For example, a statue can be viewed from many angles and at different resolutions - each being a rendition.


The system our team built manages this content-rich application. We deployed it to Microsoft Azure and run it in containers for scalability and performance.

Working at Yell Research

Best Practice

All our efforts - from DevOps to product development to system administration - aim to be based on best practices. There's a very good reason for everything we do - and we expect the same from everyone who works with us.


Software Engineers. Not Cowboys.

We aim to be the best software development team - we believe it takes a co-operative and disciplined effort from everyone on board to deliver a quality product. Being a "cowboy" is not sustainable and will not get us there.


Quality over Quantity

We don't want to just hire and fire. We want to train and retain. We are looking for quality hires - hence our recruitment process is a bit longer and a bit more involved than others you may have come across. We see it as a two-way street where you get to find out as much about us as we do about you.


Cultural Fit

If our culture sounds like it's in line with your ideals, and you'd like to see if you're a good fit for one of the roles we have below, please email us at careers@yellresearch.com

Contact

For general questions please fill out the form below.
