Corporate IT Systems – Leveraging our Hybrid Cloud

The role IT plays in business has evolved over the years, and what we represent today is changing at a rapid rate. We now live in a digital world where people grow up with technology as part of their lives; they understand the role technology plays and how it affects them both personally and professionally. Gone are the days when IT and tech were foreign concepts to most of the business.

IT has traditionally held the keys and made the key decisions while keeping the walls up: everyone else knew the IT guys sat over there and did “stuff”, but no one really knew what. At REA, the corporate IT group has worked to be the opposite of the persona so many IT teams are lumped with. In an Internet-based company with tech-savvy people everywhere, we are well placed to build transparency between us and our customers (REA staff). We have to grab that opportunity with both hands and lead from the front. Simply put: “IT as a service”. Continue reading

Not Zoolander: The other kind of modelling…

The Behavioural Communications & Analytics, Media & Developer, and IT Delivery teams, working closely with ThoughtWorks, have been running an exciting project around behavioural targeting. This work was recently presented at the Big Data & Analytics Innovation Summit in Sydney, where REA Group showed it is at the forefront of analytics in this space.

Presentation: More Than Meets The Eye

At REA, there is a wealth of data at our disposal around visitor behaviour on site, such as: section(s) visited; time on site/section(s); traffic source; myREA status; return visits; agent interaction; saving OFI times; saving searches; saving properties; getting directions; social engagement; types of suburbs searched; search refinements (price, bedrooms, bathrooms, car spaces, land size); number of properties viewed; property types viewed; attributes of properties viewed; the list goes on…

Where it gets exciting is when we start to think about how we can use this information to predict something about our visitors that we don’t know: demographics; the likelihood of purchasing a particular product or responding to a particular message; the likelihood of obtaining a desired home loan; or whether they belong to any of our key consumer groups, such as first home buyers, investors, renovators, or vendors, which is something REA Group is particularly interested in understanding right now. First home buyers are the first cab off the rank to trial this approach. Continue reading
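As a purely illustrative sketch of the idea, a toy scorer might combine a handful of the on-site signals above into a first-home-buyer likelihood score. The feature names and weights below are invented for illustration and bear no relation to the actual models:

```ruby
# Toy example only: weight a few behavioural signals into a score out of 10.
# Real models would be trained on historical data, not hand-picked weights.
FIRST_HOME_BUYER_WEIGHTS = {
  saved_searches:       3,  # visitor has saved one or more searches
  low_price_band:       4,  # searches concentrate in entry-level price bands
  saved_ofi_times:      2,  # visitor has saved open-for-inspection times
  directions_requested: 1   # visitor has requested directions to a property
}

def first_home_buyer_score(signals)
  # Sum the weights of the signals this visitor exhibits.
  FIRST_HOME_BUYER_WEIGHTS.sum { |feature, weight| signals[feature] ? weight : 0 }
end

visitor = { saved_searches: true, low_price_band: true }
puts first_home_buyer_score(visitor)  # => 7
```

A trained classifier would replace the hand-tuned weights, but the shape of the problem is the same: map observed behaviour to a probability of belonging to a consumer group.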

Pomodoro technique as a collaboration tool

We recently started using the Pomodoro Technique in our development team. The Pomodoro Technique is a time-management method that prescribes working in 25-minute blocks with short breaks in between; each 25-minute block is called a pomodoro.

We have adapted it a little for our purposes. We work as a team in synchronised pomodoros and hold a mini-standup after each one. Each week we assign a pomodoro master who is responsible for managing the process: starting pomodoros, keeping time, counting the completed pomodoros, and so on. Continue reading
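The pomodoro master's bookkeeping is simple enough to sketch in a few lines of Ruby. The class and method names here are hypothetical, not part of any actual tooling, and the duration is injectable so the cycle can be demonstrated without waiting 25 minutes:

```ruby
# A minimal sketch of the pomodoro master's job: run a timed block,
# count it, and prompt the team's mini-standup.
class PomodoroMaster
  POMODORO_MINUTES = 25

  attr_reader :completed

  def initialize(duration_minutes = POMODORO_MINUTES)
    @duration_minutes = duration_minutes
    @completed = 0
  end

  # Runs one synchronised pomodoro, then announces the mini-standup.
  def run_pomodoro
    sleep(@duration_minutes * 60) if @duration_minutes > 0
    @completed += 1
    "Pomodoro ##{@completed} complete; time for the mini-standup"
  end
end

master = PomodoroMaster.new(0)  # zero-length pomodoro for demonstration
puts master.run_pomodoro        # => "Pomodoro #1 complete; time for the mini-standup"
```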

Testing interactions with web services without integration tests in Ruby

Our team decided to move to a micro-service architecture, and we started wondering how we would test all of the integration points between lots of little services without having to rely on integration tests. We felt that testing the interactions between these services would quickly become a major headache.

Integration tests are typically slow and brittle, and require each component to have its own environment in which to run. With a micro-service architecture, this becomes even more of a problem. Such tests also have to be ‘all-knowing’, which makes them hard to keep from being fragile.

After seeing J. B. Rainsberger’s talk “Integrated Tests Are A Scam”, we have been thinking about how to get the confidence we need to deploy our software to production without a tiresome integration test suite that does not give us all the coverage we think it does.
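One common building block for testing these interactions without running the real services is to unit-test each client against a hand-rolled fake. A minimal Ruby sketch, in which `ListingsClient`, `FakeHttp`, and the `/listings` endpoint are all hypothetical examples rather than real REA services:

```ruby
require 'json'

# The client under test: talks to a remote service through an injected
# HTTP adapter, so tests can substitute a fake for the real transport.
class ListingsClient
  def initialize(http)
    @http = http  # anything responding to #get(path) with a JSON string
  end

  def listing_ids
    JSON.parse(@http.get('/listings')).map { |listing| listing['id'] }
  end
end

# Hand-rolled fake standing in for the real service in unit tests.
class FakeHttp
  def get(path)
    raise "unexpected path #{path}" unless path == '/listings'
    JSON.generate([{ 'id' => 1 }, { 'id' => 2 }])
  end
end

client = ListingsClient.new(FakeHttp.new)
puts client.listing_ids.inspect  # => [1, 2]
```

The catch, as Rainsberger points out, is that a fake like this can silently drift from what the real service actually returns; closing that gap without reverting to full integration tests is exactly the problem the post goes on to explore.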

Continue reading

Automated Schema Migration in a MySQL cluster

The PSeeker Database

REA stores listing and agency data for Australia in a MySQL database named PSeeker. This large, complex database plays a central role in REA’s business:

  • About 95 tables in use
  • Largest table has 38 million rows
  • 24 tables have over 1 million rows
  • Near 100% uptime required

PSeeker in production is a loose cluster of ~10 MySQL database servers that play a variety of roles:

  • A single active, writeable master instance that runs in our primary data center
  • Several replica slaves in the primary data center, used for read-only application load
  • A replica in our secondary data center for disaster recovery
  • A replica used for investigation by support staff
  • A replica that feeds into our data warehouse

We use MySQL statement-based replication between the master and its replicas, which essentially runs the same statements on the replicas as were run on the master.

Typically, replicas in the same data center as the master will be running behind changes in the master database by less than a second. More distant replicas can be up to a minute behind, depending upon the rate of updates and the bandwidth between the servers.
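The lag on any given replica can be observed with MySQL's standard replication status command; its `Seconds_Behind_Master` field reports how far the replica's SQL thread is behind the master:

```sql
-- Run on a replica from the mysql client; \G formats the output
-- vertically. Seconds_Behind_Master shows the current replication lag.
SHOW SLAVE STATUS\G
```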

Manual Schema Management

Schema changes, such as adding columns or new tables, occur as new products are built or legacy systems are upgraded. They originate with the development teams involved in building or upgrading applications, at an average rate of 5-8 per month.
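Under statement-based replication, a schema change of this kind is applied once on the master, and the identical DDL statement is then re-executed on each replica. A hypothetical example (the table and column names are invented, not from PSeeker):

```sql
-- Applied on the writeable master; each replica re-runs the same statement.
ALTER TABLE listings
  ADD COLUMN energy_rating TINYINT NULL;
```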

Continue reading

Our journey from Ruby 1.8.7 to 1.9.3

Recently REA moved from Ruby 1.8.7 to Ruby 1.9.3 for our listing administration tool, a large Rails application used by real estate agents to manage their listings. The endeavour was ultimately successful, but not without significant challenges.

The most notable of these was a nasty segmentation fault. At one stage this fault caused so much pain that we felt we had to share our discoveries, in the hope of easing the pain for someone else.


Like all projects, the upgrade had a number of inherent challenges and restrictions. We had to fit it in between major project priorities. We also had to make sure that one of our shared libraries, which describes common domain objects for several other internal Rails applications, maintained compatibility with 1.8.7.  This effectively meant that the build pipeline needed to create artifacts for both 1.8.7 and 1.9.3, and the source had to be compatible with both.
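Keeping one codebase loadable by both interpreters mostly means sticking to syntax and APIs that both accept. A minimal, hypothetical sketch of the style involved (the method and constant names are invented for illustration):

```ruby
# Hypothetical shared-library code kept compatible with 1.8.7 and 1.9.3:
# hashrocket hash syntax only (1.9's `timeout: 5` shorthand is a syntax
# error on 1.8.7), and feature checks rather than version sniffing.
DEFAULTS = { :timeout => 5, :retries => 3 }

def byte_length(str)
  # String#bytesize exists on both 1.8.7 and 1.9; on 1.9 it can differ
  # from #length for multibyte strings, since strings became
  # encoding-aware. Fall back to #size for very old interpreters.
  str.respond_to?(:bytesize) ? str.bytesize : str.size
end

puts byte_length('hello')  # => 5
```

The 1.9 encoding changes behind that `bytesize`/`length` distinction are also where many upgrade surprises (including string-handling crashes) tend to surface.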

Continue reading