It’s All About The Data
So people keep asking me about my new gig, and it’s really tough to explain beyond saying “I work with data. It’s all about the data.”
It’s actually a really good fit for me, working with data in this way, and although it’s only been a matter of months, I am really enjoying this new gig. But what does it actually entail?
Well, first off, let me note (again) that I am not permitted to mention the company name on a private blog or website, so I will refer to us as XYZ. The company is fairly well known by its acronym, and one of the letters in its name is H, for Health. Basically, the company deals in health-related data.
Of course that is a very broad topic. More specifically, XYZ focuses on two areas: health information (in particular, information about medical prescribers, including license numbers and statuses, DEA rights, sanctions, and other key attributes of those prescribing medications to Americans), and MDM – Master Data Management – as it relates to that health data.
So XYZ maintains its own database, culled over the years from thousands of data sources. Basically it starts with a prescriber and an address/phone/fax. This lets XYZ separate out one Dr. Jones from another Dr. Jones, though in the case of family practices this does, admittedly, get more difficult.
By looking at variants on the addresses, comparing the data sources for those variants, and the other available information (gender, birth date, school, etc.), XYZ is able to rank those addresses. So we might know that Dr. Smith is at 123 Main Street, Anytown OH, and also has an office at 345 Minor Avenue, Anytown OH. We can also tell that this Dr. Smith is different from the one at 125 Main Street, Anytown OH (at least to a reasonable extent).
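To give a flavor of what matching address variants looks like, here is a minimal sketch in Python. It is purely illustrative – the record layout is made up, and XYZ’s actual matching logic is far more sophisticated – but it shows the basic idea of normalizing addresses before comparing them, so that “123 Main Street” and “123 Main St” line up while “125 Main Street” stays distinct.

```python
from difflib import SequenceMatcher

# Hypothetical prescriber records; the field names are illustrative only.
records = [
    {"name": "Dr. Smith", "address": "123 Main Street, Anytown OH"},
    {"name": "Dr. Smith", "address": "123 Main St, Anytown OH"},
    {"name": "Dr. Smith", "address": "125 Main Street, Anytown OH"},
]

def normalize(addr):
    """Lower-case and expand a few common street abbreviations."""
    addr = addr.lower().replace(",", "")
    for abbr, full in (("st", "street"), ("ave", "avenue")):
        addr = " ".join(full if tok == abbr else tok for tok in addr.split())
    return addr

def similarity(a, b):
    """Ratio in [0, 1]; 1.0 means the normalized addresses are identical."""
    return SequenceMatcher(None, normalize(a), normalize(b)).ratio()

# "123 Main Street" and "123 Main St" normalize to the same string:
print(similarity(records[0]["address"], records[1]["address"]))  # prints 1.0

# "125 Main Street" scores high but below 1.0, so it can be held apart
# (in practice, with help from other attributes like birth date and school):
print(similarity(records[0]["address"], records[2]["address"]))
```

A real system would block candidate pairs first and weigh many attributes, not just one string ratio, but the normalize-then-compare shape is the core of it.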
Once that is known and you can create a unique identifier for a prescriber, the next step is to gather all of the other information you can about Dr. Smith. Where did he go to school, and when? What are his areas of specialty? What State Licenses does he hold to practice medicine? What are his various government-issued ID numbers – not only his NPI (National Provider Identifier), his UPIN (his Medicare practitioner identifier) and his DEA number (giving him permission to prescribe certain restricted drugs), but his Tax ID, NCPDP (National Council for Prescription Drug Programs) ID, and many, many more.
Add to that any sanctions (State or Federal, against the doctor, along with any fines, penalties and related dates), throw in all sorts of demographic info and stir, and you have yourself a nice database. Add in organizational attributes – in other words, capture all of the same information for hospitals, medical practices, clinics and more – and tie the two together, and you have yourself a very powerful tool.
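Pulling the pieces above together, a single prescriber record ends up carrying quite a few attributes. The sketch below is only an illustration of that shape – the field names are mine, not XYZ’s actual schema – covering the identifiers, licenses, and sanctions mentioned above.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Prescriber:
    """Illustrative record shape only; the real schema is not public."""
    prescriber_id: str             # the company's own unique identifier
    name: str
    npi: Optional[str] = None      # National Provider Identifier
    upin: Optional[str] = None     # Medicare practitioner identifier
    dea: Optional[str] = None      # DEA number (restricted-drug rights)
    tax_id: Optional[str] = None
    ncpdp: Optional[str] = None    # NCPDP ID
    state_licenses: list = field(default_factory=list)
    specialties: list = field(default_factory=list)
    sanctions: list = field(default_factory=list)  # fines, penalties, dates

# A toy record for the hypothetical Dr. Smith:
dr_smith = Prescriber(prescriber_id="P-001", name="Dr. Smith",
                      npi="1234567890", state_licenses=["OH-12345"])
print(dr_smith.npi)
```

Organizational records (hospitals, practices, clinics) would carry much the same attributes, with links between the two making the combined database the powerful tool described above.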
So that’s what XYZ does, data-wise. Clients often then want to incorporate their own data alongside the XYZ data, which is where the MDM (Master Data Management) comes in, building databases for regular delivery, updating the data these clients keep on hand for their own purposes.
And there are a number of ways clients use these data, largely depending on what type of organization they work for.
There are several vertical market groups at XYZ, and I work for the Pharmacy, Pharmacy Benefit Management and Government group. I am not a sales person, nor am I the technical person. My title is Lead Business Analyst and, again, my job basically comes down to the data.
First I sit with the customer to determine what data they need. I generally take things over after the client has signed their contract with the sales person, who has whittled the customer’s requirements somewhat, giving me a general ballpark. But I will sit (alongside the sales rep and, of course, my bosses) with the client to determine, in excruciating detail, what exactly the client will receive on a regular basis (and how frequently). This means outlining every data column, from what to call it to how to filter the data to how to deliver it. As most customers have fairly automated processes in place, this is not quite as straightforward as it might sound.
Once the Requirements are laid out in painful detail, I communicate with our Programmers, who turn that logic into an actual query (or queries) against our system. They have their own system which wraps up all the individual datasets as outlined in the Requirements, and a series of network drives exists solely to track every step in the processes, just in case.
I then take the data over again, and QA it every which way from Sunday. Sure, most of it is automated with scripts in SQL, Excel, Linux and other tools. But basically I need to raise the red flags to go back to Programming, or the data owners themselves, in case anything looks iffy. And if you have ever worked with large quantities of data before, you know that there is almost always something that looks iffy.
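The actual QA runs through SQL, Excel and Linux scripts, but the flavor of those automated red flags can be sketched in a few lines of Python. The column names here are hypothetical, and real checks would cover far more (row counts against the prior delivery, value distributions, format validation, and so on):

```python
# Toy delivery rows; the column names (npi, state_license) are made up.
rows = [
    {"npi": "1234567890", "state_license": "OH-12345"},
    {"npi": "1234567890", "state_license": "OH-99999"},  # duplicate NPI
    {"npi": "9876543210", "state_license": None},        # missing license
]

def qa_flags(rows):
    """Return human-readable red flags to send back to Programming
    or the data owners, rather than failing silently."""
    flags = []
    seen = set()
    for i, row in enumerate(rows):
        if row["npi"] in seen:
            flags.append(f"row {i}: duplicate NPI {row['npi']}")
        seen.add(row["npi"])
        if not row["state_license"]:
            flags.append(f"row {i}: missing state license")
    return flags

for flag in qa_flags(rows):
    print(flag)
```

The point of scripting checks like these is exactly what the paragraph above says: with data at this scale, something almost always looks iffy, and the flags tell you where to look.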
I’ll then deliver the data to the client within the terms of the contracted SLA (service level agreement), of course answering any questions they might have. That might include questions about the data itself, or it might include consulting with them to determine what other data needs they might have – at which point I will forward them back to sales to upgrade the contract and restart the process all over again.
So what do Pharmacies, PBMs and the Government do with these data?
Well, in my group, Pharmacies are the biggest customer base. With almost all of the top chains around the country, plus many of the supermarkets and even some independent (or smaller, more local chain) pharmacies as clients, I think the message is clear – avoid the fines and punishments instituted by the government for FWA (Fraud, Waste and Abuse).
It is basically up to Pharmacies to perform the due diligence of verifying that the doctor (or nurse or dentist or veterinarian) who wrote a prescription they are filling actually has the right to do so. And while there are free databases available with this information, having a company focused on ensuring the quality of these data, including timely updates, has proven more than worth the cost to the client base.
Pharmacy Benefit Management companies (think the prescription benefits arms of insurance companies) use it for similar purposes, ensuring that they are only paying pharmacies back for legitimate prescriptions, and the government (primarily the agencies overseeing Medicare and other similar benefits) has very similar goals in mind.
As I said, I like it. I am not the “owner” of this data – other teams concern themselves with the data sources, with the data inaccuracies, and with the tools to manipulate these data – but not only do I have to thoroughly understand the data, I also need to understand the various technical tools necessary to ensure my own control over what I am sending off to customers.
I might access the company’s own Web platform for smaller projects, looking up individuals to verify what I saw in the data. I use an internal data-mining tool which allows me to draw relatively simple reports, giving “sample” data sets to prospective customers (or the lower-paying clients, for that matter). But mostly I am using SQL (which I can use fairly well), and the company seems to be moving more towards a Linux architecture (with which I can create only the most basic of queries).
I am constantly pulling these data into Access or Excel to check them out and, most importantly, to document every single step I take with the data, should questions arise – as they invariably will, at some point far enough in the future that I will have forgotten what I actually did.
It’s an odd experience for me, but a good one. At Thomson I was in a role with a title of “Marketing”, more often than not. It’s still one of my favorite experiences there, sitting with a group of the tech guys and being introduced as, “despite his title, Doug is actually a data-wonk, like us”. At WK I was definitely working with data, but it was a very specialized dataset, and as it had related to my time at Thomson, it didn’t necessarily speak to my data chops. And at the government, I was basically “the tech guy”, a role that I was able to fulfill to an extent, but which wasn’t really my bailiwick.
Maybe I will get into this more in the future. (I would really love to analyze the data, pulling out some interesting tidbits about it, much in the way I did with those bibliometrics analyses I had performed.) Until then, be assured that if you remember nothing else about what it is that I actually do for 40 hours every week these days, remember “it’s all about the data”.