Assessing and Improving Routine Food Inspection Report Completeness

State: IL Type: Model Practice Year: 2015


To standardize how routine food inspection forms are filled out, the Environmental Health (EH) staff met and developed criteria that led to the creation of a checklist for reviewing inspections for consistency.  Previously there was no system for standardizing how the forms were filled out, so the project created an opportunity to improve the quality of reports given to food establishments, increase standardization among inspectors, and increase food safety by providing more detailed and complete reports for food service owners and operators.  An audit of inspections from February 2013 found that 42% were in compliance with the newly created checklist.  Inspections were initially audited by the supervisors using this checklist, but to limit subjectivity the supervisors met to test their internal consistency in how the checklist was being applied to inspection reports.  Based on this initial analysis, the group decided to implement the Plan-Do-Check-Act (PDCA) process.

The entire Environmental Health Section of nine Environmental Health Practitioners, two Program Supervisors, one Administrative Assistant, and one Assistant Director was involved in the process.  The Health Data and Quality Coordinator, who is not part of the Environmental Health Section, provided non-biased facilitation for some of the activities that took place.  All team members had an active role in the discussion, design, and implementation throughout the PDCA process.  From the results of the February baseline data, an Aim Statement was created: By 05/13/2013, the EH Section will see an increase in the percentage of completely written inspection reports from 42% to 80%.

On 02/13/2013 the EH staff were anonymously surveyed by the Health Data and Quality Coordinator regarding how often they fill in each of the required fields on the inspection report.  EH staff then each completed flowcharts to indicate their individual processes for completing inspection reports.  Both tools showed variability in the procedures among the staff members.

To determine the root causes of the problem, the EH staff created a Cause and Effect Diagram during a meeting on 03/05/13.  This session was also facilitated by the Health Data and Quality Coordinator, and management did not attend the meeting so that staff could share their thoughts and concerns with a non-biased representative.  Based on the results of the Cause and Effect Diagram, some of the root causes identified were inconsistency in assessment by the supervisors, pressures of time and workload, and not enough group collaboration in defining what a completely written inspection form is.

On 03/13/13 the EH group discussed best practices for how inspection reports are written and examined potential solutions for ensuring completeness of inspection reports.  The EH staff and management brainstormed potential solutions and created an Affinity Diagram to identify the best possible method of improvement.  Based on the Affinity Diagram results and previous discussions, the group voted to create an Inspection Standardization Form.  This served as a tool for use in the field that gave EH staff an identified list of what should be written on the inspection form and how it should be written.  The form supplied EH staff with concise guidelines for standard inspection documentation and was created collectively by the entire EH team.

In selecting the creation of an Inspection Standardization Form, the prediction was that if each EH Practitioner brought the guide and used it after each routine inspection, then the percentage of correctly written inspection reports would increase from 42% to 80% by 5/13/2013.  The form was created by the team to address the identified root cause of inconsistency and to ensure group collaboration, and the final version of the form was handed out for use from 04/13/2013 to 5/13/2013.  During this period each routine inspection was evaluated by the EH Supervisors using the inspection review checklist, the same version used to establish the February baseline data.  The supervisors also met regularly to address the issue of inconsistency among supervisors, another concern that came out of the root cause analysis.

Because the team anticipated that improvements might be seen just by identifying and working through the PDCA process, data was collected from February 2013 until the end of the PDCA cycle.  The data was collected and analyzed by the two EH Supervisors.  Bar charts showed monthly results for each EH Practitioner and a group average based on the percentages of violations written correctly, forms filled out correctly, and completely written reports.  Bar charts were created for February, March, April, May (May 1-13), and for the implementation period of April 13 to May 13.  A line chart from February 2013 to May 2013 showed the percentage of completely written inspection reports.  Individual line charts for each EH Practitioner also showed the weekly percentage of completely written inspection reports throughout the entire PDCA process.  Trend lines were added to these graphs to show an average positive or negative trend.  All individual data was displayed anonymously.
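The monthly percentages behind the bar charts described above amount to simple counts from the supervisors' checklist audits.  A minimal sketch, using entirely hypothetical audit records and field names, of how a per-practitioner result and a group average could be tallied:

```python
# Minimal sketch (hypothetical data): aggregating checklist audit results
# into monthly percentages of completely written reports.
from collections import defaultdict

# Each audit record: (month, practitioner, report_complete) -- hypothetical fields.
audits = [
    ("Feb", "EHP-1", True), ("Feb", "EHP-1", False),
    ("Feb", "EHP-2", False), ("Mar", "EHP-1", True),
    ("Mar", "EHP-2", True), ("Mar", "EHP-2", False),
]

# (month, practitioner) -> [complete count, total count]
totals = defaultdict(lambda: [0, 0])
for month, ehp, complete in audits:
    totals[(month, ehp)][1] += 1
    if complete:
        totals[(month, ehp)][0] += 1

def pct_complete(month, ehp=None):
    """Percent of completely written reports for one EHP, or the group average."""
    keys = [k for k in totals if k[0] == month and (ehp is None or k[1] == ehp)]
    done = sum(totals[k][0] for k in keys)
    total = sum(totals[k][1] for k in keys)
    return round(100 * done / total, 1)

print(pct_complete("Feb"))           # group average for February
print(pct_complete("Mar", "EHP-2"))  # one practitioner's March result
```

The same tally, run over all practitioners for a month, yields the group-average bar; run per practitioner, it yields the anonymized individual bars.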

Data showed an increase in completely written inspections from 42% in February to 75% by the end of the PDCA cycle (05/13/2013).  The data showed month-over-month increases in the average percentages of correctly written violations, forms, and completely written reports.  Individual data also showed increases for every Environmental Health Practitioner, though with variation in the degree of improvement.  Follow-up occurred with individuals who did not reach the desired mark and lowered the overall group average, whereas some individuals reached 100% compliance by the end of the implementation period.

While the improvement did not reach the desired goal of 80%, the increase from the baseline of 42% to 75% at the end of the PDCA cycle was deemed a success by the team.  On 07/06/13 the team evaluated the Inspection Standardization Form via a SWOT analysis.  The analysis revealed an increased level of consistency and team collaboration, but the team felt the development process was time consuming.  The SWOT also identified opportunities for new projects.

To standardize the improvement, the Inspection Standardization Form is now standard practice and serves as a tool that EH Practitioners use during their inspections.  The form has also been incorporated into new employee training.  To sustain the gains, the EH Section will continue to monitor this data on a quarterly basis, as it is now part of the KCHD Performance Management System.  If the completeness percentage decreases on the performance management dashboard, EH management discusses the issue and then uses quality improvement tools to determine root causes and identify improvement strategies so that the desired goals are continually reached.

Numerous future plans arose throughout the PDCA process, such as creating a future PDCA around what is considered a “correctly” written violation and possible changes to the current inspection form; the project has also been a driving mechanism toward digital inspections in the future.  The project also led to the creation of another PDCA in 2014 around assessing and increasing the rate of Educational In-Service inspections at food service establishments.  The project was published through the Public Health Quality Improvement Exchange (PHQIX) in late 2013 and was a featured webinar on PHQIX in 2014.  To share our success so that others may learn from our process, the project was presented at the Illinois Public Health Association Food Safety Symposium on September 2-4, 2014.  Even though the Aim Statement of the PDCA ran through 5/13/2013, the process has become standard practice and the project continues to be evaluated and addressed through the use of the performance management system and quality improvement methods.

 

Kane County Health Department
Assessing and Improving Routine Food Inspection Report Completeness
Located 40 miles west of Chicago, Kane County is home to 515,269 people who reside in 30 municipalities. Kane is ranked as the 5th largest by population of 102 Illinois counties and the 51st largest by area (522 square miles). By 2040, the population is expected to reach 800,000, with corresponding growth projected for households and jobs (53% and 64%, respectively). The demographic profile has changed dramatically in the past 2 decades. The 2010 U.S. Census reports that the Hispanic population has tripled since 1990 and now stands at 158,390, or 31% of the total population, the highest proportion of Hispanic residents of all Illinois counties. Kane County is notable for its age distribution. The median age in Kane is 35.4 years (national median age of 36.7 years), and the fastest-growing segment of the population is 55–69-year-olds. Children younger than 18 years make up 34% (174,763) of the population.

Because no standardized system was in place, there were variations in how inspectors were filling out inspection reports. Inspectors are trained on how to recognize violations during an inspection, but there wasn’t as much focus on the standardization of the writing of the report itself. Creating a standardized and completely filled out inspection report could lead to higher quality, increased consistency, and reductions in unnecessary follow-up inspections. It could also help achieve grant requirements when the health department is reviewed by the Illinois Department of Public Health. Giving the food establishments a higher-quality report can lead to increased education and understanding of the violations, which can lead to better food safety practices and a possible reduction in foodborne illnesses. The need for a quality improvement (QI) initiative was first determined in an internal meeting among the EH Section, in which the team identified important criteria on an inspection report.
Those criteria were used to create a checklist to audit inspection reports for completeness. An audit of all routine inspections conducted in February 2013 showed that only 42% of inspections were in compliance using the newly created checklist.  The goal was that by May 13, 2013, the percentage of correctly written inspection reports would increase from 42% to 80%. While the end of the implementation reached a 75% compliance rate rather than the 80% goal, the project was deemed a success, as every member of the EH staff saw a positive trend and some members had 100% compliance by the end of the implementation period.  The data showed month-over-month increases in average percentages of correctly written violations, forms, and completely written reports, which were all measures of success during the implementation period and the entire PDCA process. There were several impacts from implementing the process.  Some of the future plans that arose throughout the PDCA process were creating a future PDCA around what is considered a “correctly” written violation, possible changes to the current inspection form, and the project as a driving mechanism toward digital inspections in the future.  The project also led to the creation of another PDCA in 2014 around assessing and increasing the rate of Educational In-Service inspections at food service establishments.  The process is also a standardized practice used by all inspectors.  Because it is a standardized practice, it has helped in training new staff coming into the program.  Having inspection forms written to a higher quality has led to more consistent and detailed inspection reports for food service operators and owners, which can increase education and hopefully reduce poor food safety practices.  It has also reduced unnecessary follow-up inspections and the time supervisors spend reviewing inspection reports.
The organization’s website is www.kanehealth.com
The problem was that there was no standardized process for how inspection reports are written based on the food service sanitation code.  Health officers are standardized on how to find violations in an establishment, but there is variability in how reports are written.  Even though various health departments may use similar forms and look for similar violations, the way the report is written may vary.  Evidence suggests that an educational component is essential to increasing food safety practices and potentially reducing foodborne illness, which is why Certified Food Managers are required and why the state of Illinois has basic food safety requirements for food service workers.  Not recognizing this as an issue, or as an opportunity for improvement, limits how food inspections can be improved upon. Previously this was addressed by EH managers reviewing the inspection reports and working with individual inspectors if reports were written incorrectly or incompletely.  This is still only an individual effort and does not address consistency across the EH team, which is why the opportunity exists.  When looking at this concern, the health department was not able to find other organizations that had addressed this issue or built a system to improve upon it, which is why the health department saw this as a great opportunity to try a new approach and decided to implement the PDCA cycle to look for ways to improve. A properly written inspection form has numerous public health benefits.  For one, the food service manager has a better understanding of the issue at hand.  Previous inspection forms might say “clean the prep table”, which does not tell the owner or operator why this is a public health concern.  In this process a violation is considered complete if it states what the violation is, where it is in the establishment, why it is a public health concern, and what to do to correct it.
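The four completeness criteria above lend themselves to a simple mechanical check.  A minimal sketch, with hypothetical field names (the actual checklist fields are not specified in this report), of auditing a written violation against those criteria:

```python
# Minimal sketch (hypothetical field names): checking one written violation
# against the four completeness criteria -- what, where, why, and correction.
REQUIRED_PARTS = ("what", "where", "why", "correction")

def violation_is_complete(violation: dict) -> bool:
    """A violation is complete only if it states what the violation is,
    where it is in the establishment, why it is a public health concern,
    and what to do to correct it."""
    return all(violation.get(part, "").strip() for part in REQUIRED_PARTS)

# "Clean the prep table" alone fails the check: it omits where and why.
incomplete = {"what": "Prep table soiled", "correction": "Clean the prep table"}
complete = {
    "what": "Prep table soiled with food debris",
    "where": "Back kitchen prep area",
    "why": "Soiled surfaces can cross-contaminate ready-to-eat food",
    "correction": "Clean and sanitize the prep table",
}
print(violation_is_complete(incomplete))  # False
print(violation_is_complete(complete))    # True
```

The second example mirrors the report's point: the complete entry tells the operator not just what to do but why it matters for public health.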
By providing more detailed information, the food service worker can understand why an issue is a violation and a threat to public health, which increases the chance the operator will correct the violation or at least appreciate the significance of the issues being addressed on the inspection report. Another benefit of a completely written inspection report is quality assurance for the EH managers reviewing the report.  If part of the report is not filled out, the manager cannot know whether the item was addressed during the inspection.  If a particular item, for example checking sanitizer concentration levels, is not documented, the inspector may have missed a public health threat even though the inspection was performed.  A completely written inspection report helps the inspector ensure all items were addressed during the inspection, and helps the manager reviewing the inspector's work.  Other benefits could be meeting grant requirements, easier audits by the Illinois Department of Public Health, and more detailed reports in case of any litigation or Freedom of Information Act (FOIA) requests. One could argue that this practice is new to public health, as many health departments standardize looking for violations but not the completeness and standardization of the inspection report itself.  When conducting research prior to the start of this project, the health department staff could not find a similar project or practice, which is why it can be viewed as an innovative practice.  The health department saw this as an opportunity to use quality improvement to take an existing process and create a new, innovative practice.  The team decided to use the Plan-Do-Check-Act (PDCA) tool to help formalize the improvement process and used many QI tools throughout, such as surveys, flowcharts, a fishbone diagram, numerous bar and run charts, brainstorming, an affinity diagram, and a SWOT analysis.
There is evidence that increased education during inspections can help improve food safety practices and hopefully reduce foodborne illness, which was a driving force behind implementing the project among the EH team.  Numerous articles, publications, and studies indicate the effect of education on increasing proper food safety practices.  A few examples are “Effect of manager training program on sanitary conditions in restaurants” by Cotterchio, Gunn, Cofflin, and Barry, and “Enhancing food safety culture to reduce rates of foodborne illness” by Powell, Jacob, and Chapman.  Increased education and detailed inspection reports can also help address Healthy People 2020 objective FS HP2020-6, to increase the proportion of people who follow food safety practices.  By promoting better food safety practices and ensuring better enforcement and regulation through completed inspection reports, improved food safety practices can follow.  Evidence can also come through the data collected throughout the implementation of the project, in which inspection report scores can be tracked over time using the completed inspection report.  A separate item that came from this project is that in 2014 the EH group began working on another PDCA project to assess and increase the number of In-Service educational inspections at food service establishments.  The project was published through the Public Health Quality Improvement Exchange (PHQIX) in late 2013 and was a featured webinar on PHQIX in 2014.  To share our success so that others may learn from our process, the project was presented at the Illinois Public Health Association Food Safety Symposium on September 2-4, 2014.
Food Safety
An audit of inspections from February 2013 found that 42% were in compliance with the newly created checklist.  Inspections were initially audited by the supervisors using this checklist, but to limit subjectivity the supervisors met to test their internal consistency in how the checklist was being applied to inspection reports.  Based on this initial analysis, the group decided to implement the Plan-Do-Check-Act (PDCA) process. The entire Environmental Health Section of nine Environmental Health Practitioners, two Program Supervisors, one Administrative Assistant, and one Assistant Director was involved in the process.  The Health Data and Quality Coordinator, who is not part of the Environmental Health Section, provided non-biased facilitation for some of the activities that took place.  All team members had an active role in the discussion, design, and implementation throughout the PDCA process.  From the results of the February baseline data, an Aim Statement was created: By 05/13/2013, the EH Section will see an increase in the percentage of completely written inspection reports from 42% to 80%. On 02/13/2013 the EH staff were anonymously surveyed regarding how often they fill in each of the required fields on the inspection report.  EH staff then each completed flowcharts to indicate their individual processes for completing inspection reports.  Both tools showed variability in the procedures among the staff members. To determine the root causes of the problem, the EH staff created a Cause and Effect Diagram during a meeting on 03/05/13.  Based on the results of the Cause and Effect Diagram, some of the root causes identified were inconsistency in assessment by the supervisors, pressures of time and workload, and not enough group collaboration in defining what a completely written inspection form is.
On 03/13/13 the EH group discussed best practices for how inspection reports are written and examined potential solutions for ensuring completeness of inspection reports.  The EH staff brainstormed potential solutions and created an Affinity Diagram to identify the best possible method of improvement.  Based on the Affinity Diagram results and previous discussions, the group voted to create an Inspection Standardization Form.  This served as a tool for use in the field that gave EH staff an identified list of what should be written on the inspection form and how it should be written.  The form supplied EH staff with concise guidelines for standard inspection documentation. In selecting the creation of an Inspection Standardization Form, the prediction was that if each EH Practitioner brought the guide and used it after each routine inspection, then the percentage of correctly written inspection reports would increase from 42% to 80% by 5/13/2013.  The form was created by the team to address the identified root cause of inconsistency and to ensure group collaboration, and the final version of the form was handed out for use from 04/13/2013 to 5/13/2013.  During this period each routine inspection was evaluated by the EH Supervisors using the inspection review checklist, the same version used to establish the February baseline data. Because the team anticipated that improvements might be seen just by identifying and working through the PDCA process, data was collected from February 2013 until the end of the PDCA cycle.  The data was collected and analyzed by the two EH Supervisors.  Bar charts showed monthly results for each EH Practitioner and a group average based on the percentages of violations written correctly, forms filled out correctly, and completely written reports.  Bar charts were created for February, March, April, May (May 1-13), and for the implementation period of April 13 to May 13.
A line chart from February 2013 to May 2013 showed the percentage of completely written inspection reports.  Individual line charts for each EH Practitioner showed the weekly percentage of completely written inspection reports throughout the entire PDCA process.  Trend lines were added to these graphs to show an average positive or negative trend.  All individual data was displayed anonymously. Though the timeframe of the PDCA ended on May 13, 2013, the practice still continues and has become standard practice.  In 2014 the objective was established in the health department's performance management system and is tracked and evaluated regularly.  If there is a drop in the identified goal, EH management discusses the issues and utilizes quality improvement tools to address potential root causes and identify solutions. The largest group of stakeholders involved were the food service establishments being inspected.  This group serves as the Voice of the Customer, because their input is used to see whether the new practice is effective.  In 2014 the health department implemented a new customer service process throughout the organization, in which once a quarter, for an entire month, every food service inspection also involves surveying the owner or operator using a 10-question survey.  Their input indicates whether the current process is effective and valued among the food service establishments. Aside from the food service workers, the Health Data and Quality Coordinator was also involved with the project.  Throughout this process this role provided around 35 hours of in-kind services, which amounts to an estimated $5,000 in actual staff time.  Because this involved the internal EH group, there were no startup costs, just the staff time to work throughout the process.
Staff time accounted for 40% of work time throughout the PDCA for the project lead/EH Supervisor, 30% for the other supervisor, and 15-20% for the rest of the team.
From the results of the February baseline data, an Aim Statement was created: By 05/13/2013, the EH Section will see an increase in the percentage of completely written inspection reports from 42% to 80%. The data was collected and analyzed by the two EH Supervisors.  Reports were analyzed using the checklist established by the group at the beginning of the project, and every inspection report from February until May 13 was reviewed and analyzed.  Weekly result/check sheets were also created for each inspector that contained copies of each inspection with any incomplete areas noted.  Bar charts showed monthly results for each EH Practitioner and a group average based on the percentages of violations written correctly, forms filled out correctly, and completely written reports.  Bar charts were created for February, March, April, May (May 1-13), and for the implementation period of April 13 to May 13.  A line chart from February 2013 to May 2013 showed the percentage of completely written inspection reports.  Individual line charts for each EH Practitioner showed the weekly percentage of completely written inspection reports throughout the entire PDCA process.  Trend lines were added to these graphs to show an average positive or negative trend.  All individual data was displayed anonymously. Data showed an increase in completely written inspections from 42% in February to 75% by the end of the PDCA cycle (05/13/2013).  The data showed month-over-month increases in average percentages of correctly written violations, forms, and completely written reports.  Individual data also showed increases for every Environmental Health Practitioner, though with variation in the degree of improvement.  Follow-up occurred with individuals who did not reach the desired mark and lowered the overall group average, whereas some individuals reached 100% compliance by the end of the implementation period.
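The trend lines added to each practitioner's weekly chart are, in effect, ordinary least-squares fits of percent-complete against week number; the sign of the slope shows whether that inspector is improving.  A minimal sketch with hypothetical weekly values (the report does not publish individual data):

```python
# Minimal sketch (hypothetical weekly percentages): fitting the trend line
# used on each practitioner's weekly chart. A positive slope indicates
# improvement over the PDCA cycle; a negative slope flags a need for follow-up.
def trend_slope(weekly_pcts):
    """Ordinary least-squares slope of percent-complete vs. week number."""
    n = len(weekly_pcts)
    xs = range(n)
    x_mean = sum(xs) / n
    y_mean = sum(weekly_pcts) / n
    num = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, weekly_pcts))
    den = sum((x - x_mean) ** 2 for x in xs)
    return num / den

# Hypothetical five-week series for one practitioner, in percent complete.
print(trend_slope([40, 50, 55, 70, 75]))  # positive slope: improving weekly
```

Spreadsheet trend lines compute the same slope; the point here is only that a single signed number summarizes each practitioner's direction of change.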
While the improvement did not reach the desired goal of 80%, the increase from the baseline of 42% to 75% at the end of the PDCA cycle was deemed a success by the team.  No modifications were made during the PDCA process, but the 80% goal was infused into the performance management system and is tracked continuously and analyzed quarterly as a performance goal.
There were many lessons learned throughout the PDCA process.  EH management learned that meeting regularly would help ensure review consistency and let staff know there is consistency in their reviews.  It was learned that inspectors had different methods for how they conducted their inspections and when they filled out the inspection report, which helped them learn best practices from one another.  The PDCA lead learned that team collaboration throughout the process was the driving force behind its success and was just as essential as the quality improvement tools and methods being used.  EH management also found that having a non-biased facilitator speak with staff separately helped elicit information that staff may have been afraid to share with management in the room. The lesson learned through partner collaboration was that, through implementation of the new practice, customer service surveys in 2014 have had near-perfect results and show that the food service owners and operators feel they are getting high-quality inspections and are appreciative of the work being done.  One could assume that operators may not value or want inspections, whereas the survey results are very positive for the EH staff.  A lesson learned is that it would have been beneficial to pre-test and post-test the food service operators to quantify the value of the change in the inspection form process.  The opportunity remains to evaluate the operators in the future to see how the change has benefited them.  Another lesson learned would be to look at this practice on a regional level with our neighboring health departments, as we follow the same regulations and many have similar inspection forms. No cost-benefit analysis was done, but at present there is visible value from the change in the process.
With completely written reports, the manager does not have to schedule possibly unnecessary follow-ups, which previously occurred if an essential piece was missing from the inspection report.  The manager also does not have to spend as much time reviewing inspection reports, because there is a standard practice that staff follow.  The manager also spends less time meeting individually with inspectors, and any issues can now be addressed during a regular EH section meeting. The results of the customer service surveys show that sustainability is essential for our customers, the food service owners and operators.  To ensure sustainability, this process is now standard practice and is also used for new employee training.  To ensure the 80% goal is achieved and sustained as a group, the objective is a performance measure in the health department’s performance management system.