One of the most challenging things that a health plan can do is migrate from one core administration system to another. With many health plans still running legacy platforms (PowerSTEPP, PowerMHC, Amisys, and the like), these migration projects are going to continue to present new challenges across the industry for years to come. Many plans are choosing to migrate to newer solutions like HealthRules Payer, QNXT, or Conduent’s HSP platform.
There is perhaps no greater risk to the success of a core claims migration project than data conversion. Legacy systems have data structures that do not capture all of the data elements modern systems require. Older platforms were often developed before HIPAA regulations took effect. They have often undergone heavy customization to keep them working in today’s world. They may be augmented with external processes and systems. Or they may simply not be working very well at all, adding to the impetus to migrate away from them.
Provider data poses significant challenges during a claims system migration. Because provider data comes in from a variety of sources, it is often missing or incomplete. That will very quickly bring overall conversion efforts to a halt, because provider data issues ripple down to member, claim, and authorization conversion efforts.
What can a health plan do with its provider data before and during a claims system implementation project to ensure success? Let’s talk about four best practices that can save your health plan time, money, and frustration.
Best Practice #1: Proactive Provider Data Quality Analysis
As soon as you begin to plan your system implementation, you should begin a provider data quality analysis project. This will help you to determine where your challenges are. You can then plan for how to address those data challenges. After that, you can identify tools or resources that will set your system implementation up for success.
While it helps to know what system you will be migrating to (QNXT, HealthRules Payer, Facets, etc.), it’s not a requirement. A good provider data quality analysis will leverage industry best practices in a system-agnostic manner. This will give you a holistic view of your data quality. From there you can perform a root cause analysis to determine how to clean up and maintain data during your implementation.
Common Provider Data Quality Issues
There are some essential data quality measures that you can analyze right away to get a sense of your provider data quality. These would include:
- Ensuring that you have valid Tax IDs and NPIs
- Validating that provider addresses are up to date and accurate
- Capturing accurate provider names
- Verifying that provider specialty data is correctly captured in your current system
You should look at situations where you have in-network providers associated with non-par contracts and vice versa. It’s a good idea to flag any providers where you may have the wrong NPI on file. This can often be determined by matching the names and NPI types on the provider records.
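One of the checks above, confirming that an NPI is even plausible, can be automated outright: every NPI carries a check digit computed with the Luhn algorithm over the number prefixed with the constant "80840". A minimal sketch in Python (the function names here are illustrative, not from any particular vendor tool):

```python
def luhn_valid(digits: str) -> bool:
    """Standard Luhn mod-10 check over a string of digits."""
    total = 0
    for i, ch in enumerate(reversed(digits)):
        d = int(ch)
        if i % 2 == 1:  # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

def is_valid_npi(npi: str) -> bool:
    """An NPI is 10 digits; its check digit is verified by running
    the Luhn algorithm over '80840' + NPI (per the CMS standard)."""
    return len(npi) == 10 and npi.isdigit() and luhn_valid("80840" + npi)
```

A check like this cannot tell you that an NPI belongs to the right provider, only that it is structurally valid; pairing it with a name match against the NPPES registry catches the rest.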
Pay particular attention to provider addresses. It’s recommended that you use tools to validate against USPS data. At a bare minimum, you want to ensure that you have valid city, state, and zip code combinations. Many systems may allow for “invalid” addresses, but they will reject records where the city, state, and zip code do not match. This is important for claims payment and provider directories. Providers may be reimbursed at different rates based on location. Having an incorrect location on file can also impact regulatory reporting and provider directory accuracy.
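The bare-minimum address check described above, a valid city/state/ZIP combination, is also straightforward to automate. The sketch below uses a tiny hypothetical lookup table; a real implementation would load the full USPS city/state file or call an address-validation service:

```python
# Hypothetical sample of USPS city/state data, keyed by 5-digit ZIP.
# In production this would come from the USPS City State Product
# or an address-validation API, not a hand-built dict.
ZIP_LOOKUP = {
    "06103": ("HARTFORD", "CT"),
    "92701": ("SANTA ANA", "CA"),
}

def check_city_state_zip(city: str, state: str, zip5: str) -> list:
    """Return a list of issues found in a city/state/ZIP combination."""
    issues = []
    if not (len(zip5) == 5 and zip5.isdigit()):
        issues.append("ZIP is not 5 digits")
        return issues
    match = ZIP_LOOKUP.get(zip5)
    if match is None:
        issues.append("unknown ZIP code")
    else:
        exp_city, exp_state = match
        if state.upper() != exp_state:
            issues.append(f"state should be {exp_state} for ZIP {zip5}")
        if city.upper() != exp_city:
            issues.append(f"city does not match USPS city {exp_city}")
    return issues
```

Running every provider address through a check like this gives you a worklist of records that would otherwise reject during conversion.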
It may be tempting to focus on just in-network providers. But remember that implementing a new claims system often means bringing over non-participating providers that have had any claims activity within a given period. You should identify high-volume non-par providers and include their data in your analysis project. This could mean the providers that account for the top 80% or so of your out-of-network claims.
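Identifying the non-par providers that account for roughly the top 80% of out-of-network claims is a simple cumulative-volume cut. A sketch, assuming you can produce claim counts per provider from your claims data:

```python
def top_volume_providers(claim_counts: dict, threshold: float = 0.80) -> list:
    """Return the smallest set of providers (highest volume first)
    that together account for `threshold` of total claim volume.
    `claim_counts` maps a provider key (e.g. NPI) to a claim count."""
    total = sum(claim_counts.values())
    selected, running = [], 0
    for provider, count in sorted(claim_counts.items(),
                                  key=lambda kv: kv[1], reverse=True):
        if running >= threshold * total:
            break
        selected.append(provider)
        running += count
    return selected
```

For example, with counts of 60, 25, 10, and 5 claims across four providers, the first two already cover 85% of volume, so only they would make the cleanup list.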
Best Practice #2: Ongoing Provider Data Quality Review
Remember that your provider data is going to change daily. Providers will change medical groups and locations. You will see new non-participating providers come in on claims, which must then be set up in the system. Providers will retire, contracts will terminate, and occasionally a provider will be sanctioned.
Once you’ve selected the claims system that you will migrate to, you’ll start to get visibility into the data required to migrate to that new platform. This may require an evaluation of your provider data using additional rules to ensure that your data will load into the new system.
It is recommended that you review the results of your initial data quality analysis so that you can perform critical fixes. As you get into the system implementation project, you should continue to perform ongoing analysis. This can be challenging when your team members start to get pulled into various project activities. But it is important to re-evaluate your data regularly. We recommend performing this analysis every month during the project to stay on top of data clean-up. Then perform a final analysis and clean-up two weeks before you go live with the new system.
Best Practice #3: Automate Provider Data Cleanup
Many issues with provider data can be automatically cleaned up with relative ease. Using various data sources and clean-up techniques, you can fix many of the most common provider data issues quickly without manual intervention. This may require a small project with your IT team to load the clean data into the existing system. Automating the clean-up as much as possible will make it more feasible to monitor data quality throughout your implementation. It also helps keep the pressure off your team members so they can focus on the project. It can save you substantial time and money by helping you avoid the need to ramp up temporary staff to clean up the data within the time constraints of the project.
You should clean the data in your legacy system if possible. This will make your data conversion work much more straightforward and manageable. However, the legacy system may have significant limitations: it may be difficult or expensive to integrate with, or the resources may not be available to build data loaders to get clean data in.
Some health plans have employed robotic process automation (RPA) bots to take the results of their data cleanup efforts and update legacy systems. This can be a relatively straightforward way to ramp up more “muscle” without having to deal with legacy technologies.
Another approach is to handle the cleanup through your data conversion process. Data can be extracted from your legacy system, scrubbed, and then loaded to a clean conversion database. The conversion database can then be used to transform the data for the new claims system. Think of it as “dialysis for your provider data,” if you will.
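The extract-scrub-load flow can be sketched in a few lines. Everything here is illustrative: the record fields and scrub rules are stand-ins for whatever your legacy extract and target system actually require.

```python
def scrub_provider(rec: dict) -> dict:
    """Apply simple, automatable cleanup rules to one provider record.
    These three rules are examples only; real rule sets are driven by
    the target system's load requirements."""
    clean = dict(rec)
    clean["name"] = rec.get("name", "").strip().upper()
    # keep only digits in the NPI (drops stray dashes and spaces)
    clean["npi"] = "".join(ch for ch in rec.get("npi", "") if ch.isdigit())
    # normalize ZIP+4 down to the 5-digit ZIP
    clean["zip"] = rec.get("zip", "")[:5]
    return clean

def run_conversion(legacy_records, conversion_db: list) -> list:
    """Extract -> scrub -> load into a clean conversion store, which
    later feeds the transform step for the new claims system."""
    for rec in legacy_records:
        conversion_db.append(scrub_provider(rec))
    return conversion_db
```

Because the scrub step is isolated in one function, the same rules can be rerun every time the legacy extract is refreshed during the project.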
Best Practice #4: Choose Your Battles Carefully
It can be easy to back your team into a corner when it comes to cleaning provider data. It will be an ongoing battle that will outlast the system implementation itself. Remember that “perfect” is often the enemy of “good enough.” Pick the issues that will be true showstoppers and focus your attention on those. If you’re able to automate the cleanup of other issues, consider that a bonus.
This can mean several different things from the perspective of system migration.
First, establish reasonable metrics for successful data conversion. You may choose to split these metrics between PAR and NON-PAR providers. If you’ve seen very little claims activity from a particular non-par provider, you may decide that it’s not critical to even load that provider. In your new operating environment, you will need to have a process for handling non-par setup. So if you get a claim from those providers in the future, you can set them up fresh. On the other hand, PAR providers generally need to be in good condition within the system. This is especially true if you plan to generate provider directories from the claims system.
Next, review your target system’s rules for provider data and determine which ones are showstoppers. Focus your efforts on addressing those issues. For example, many systems prefer fully validated US addresses, but at a bare minimum they require a valid city, state, and zip code combination. The latter can often be cleaned up in an automated fashion, whereas the former often requires research or provider outreach to fully correct. In this case, you can focus your energy in one specific area, take a significant burden off of your team, and mitigate a costly risk to the implementation project.
Bonus Best Practice: Get Professional Provider Data Quality Support
Don’t head into such a major undertaking without having the right people and tools in your corner. Right from the beginning, you should look to engage professional assistance in evaluating and correcting your provider data.
Decipher Solutions offers just such assistance. We recommend that you engage our qualified professionals to:
- Guide your data conversion approach based on decades of experience in the space
- Perform an initial evaluation of your provider data quality and make recommendations for processes and tools to get you on the right track
- Implement our Maven One Rules Engine (MORE) to automate the analysis and scrubbing of your provider data to keep your project on track
- Help you choose your battles wisely, focusing on pragmatic needs of the program and organization while systematically improving all aspects of your provider data management
Contact us today for a no-obligation initial consultation and a customized provider data best practices report to help you get started on the right path.