How to Scale Different Data Models
Architecting for Scalability and Elasticity
Date: This event took place live on March 2, 2017
Presented by: Greg Meddles
Duration: Approximately 60 minutes.
Achieving scalability and elasticity is a huge challenge for data architects. Data just isn't small, neat, and orderly anymore. While all database vendors say they scale big, what if you want to scale to handle over 6,000 queries per second? Once you are processing data at this speed, how do you handle bottlenecks exhibited under sustained or bursting loads? If you are running into these issues, it might be because you are adding unnecessary steps in the process.
In this webcast, join Greg Meddles, Sr. Principal Consultant for MarkLogic, to learn how complex applications and systems must be designed to scale horizontally — both at the software and the hardware level. This concept of "scaling through parallelism" is crucial for performance. Greg will also discuss how to maximize your use of data models you already have, to avoid converting them for every individual query.
You will learn:
About Greg Meddles, Senior Principal Consultant – MarkLogic
Greg Meddles thrives when solving difficult problems, and has spent the last five years delivering solutions with MarkLogic. As the technical lead for MarkLogic at healthcare.gov since September 2013, he has had many opportunities to thrive, particularly in the areas of system performance and scalability. Greg brings implementation experience as both a consultant and product engineer for customers in the financial sector, the Intelligence Community, and healthcare. He has an interest in semantics, artificial intelligence, and natural language understanding. Greg holds a Bachelor's Degree in Mathematics from Birmingham-Southern College.