B.C. used to have the Personal Service Establishments Regulation, a short, three-section piece of legislation that basically said: wash your hands, and don't cause a health hazard. With the creation of the Public Health Act, the provincial government took the initiative to move some legislation around, create some new material, and get rid of some dead weight. That led to the elimination of the PSE Reg and the creation of the Regulated Activities Regulation, a much longer seven-section piece of legislation that basically says: wash your hands, and don't go in a tanning booth if you're a minor. Essentially, they combined a bunch of pieces of legislation that didn't fit anywhere else (including the use of public toilets and soil amendment codes of practice) and put them together. With the exception of the "don't tan if you're a minor" provision, there are no offenses pursuant to this piece of legislation. The personal services industry is basically a free-for-all, with Environmental Health Officers relying on a broad definition and interpretation of "health hazard" to keep people from getting blood-borne illnesses.
As the idea of what constitutes "body modification" spreads beyond simple tattooing and piercing into bifurcations, brands, and scarification, it's more necessary than ever to ensure that the public health professionals inspecting these facilities not only know what they're seeing (and the potential health risks associated with those activities), but also have the tools to act on non-compliance when they find it.
It's not just B.C. that's struggling with the idea of how to regulate the industry: a 2011 National Post article stated that Ottawa Public Health was "given extra money for another public health inspector to focus on tattoo and piercing parlours", leading to a 136% increase in inspections in these facilities.
A study in Pennsylvania and Texas looked at protocols and procedures in personal service establishments, and compared them to established guidelines meant for use in the health care industry. Besides trying to determine what existing practices were being followed in these establishments, the researchers also wanted to see whether local/state regulations had any effect on exposure reduction activities, whether membership in professional associations made workers more likely to follow established protocols, and whether there was any difference in compliance between employees and owners of shops.
In general, the researchers found that most shops were pretty good with the infection control procedures (with a couple of exceptions), but most were also terrible with the administrative and record keeping requirements (with few exceptions). The shops knew that they needed to do things like autoclave their equipment, but didn't have written policies and procedures for exposure control, and didn't offer or document hepatitis B vaccines. When comparing the compliance results to the existence of regulation, the researchers "found little difference in compliance whether shops were located in regulated or nonregulated locations". Interestingly, some study participants stated that they felt compliance staff "had no idea how the body art industry functioned".
The study did find a difference in compliance rates between shops that were affiliated with a professional association, and those that were not: only the affiliated shops showed any compliance with the administrative standards (as minimal as that compliance was). This might indicate that the professional associations could serve as a vehicle for getting information to body modifiers in a manner that they understand and respect. Clearly, the EHO/PHI is not being seen as a voice of knowledge in this industry, so information disseminated by local health authorities would likely not be as well received as if it came internally from the industry.
This specific study highlights the concerns that are found throughout Canada with regard to the regulation of the body modification industry: there aren't many regulations out there, and if there are, they're not well enforced. Public health professionals lack the training, knowledge, and tools to adequately regulate this growth industry, leading to an increased risk to clients and workers alike. To move forward, legislators need to work with the industry to establish guidelines and regulations, and then provide adequate training to regulators and practitioners alike (preferably in the same manner, at the same time, so they can learn from one another). Just using a "don't cause a health hazard, and wash your hands" approach to regulating body modification is leaving too much to chance.
Sources: Boseveld, S. (2011, December 12). Canadian health care can't keep up with body modifications. National Post. Retrieved from http://news.nationalpost.com/2011/12/12/canadian-health-care-cant-keep-up-with-body-modifications/
Lehman, E.J., Huy, J., Levy, E., Viet, S.M., Mobley, A., & McCleery, T.Z. (2010). Bloodborne pathogen risk reduction activities in the body piercing and tattooing industry. American Journal of Infection Control, 38, 130-138.
2.27.2014
Can an inappropriate septic setback affect more than just your health?
Regardless of your level of education, you likely know that if your septic system isn't far enough away from your drinking water / house / swimming pool / driveway, you're going to have a bad time. There are other factors to take into consideration, of course, like soil composition, pre-discharge sewage treatment, and potential break-out areas, but in general you want to keep your septic system far away from the rest of your life.
There are certain geographic areas where a traditional on-site sewerage system just isn't going to be feasible because of high water tables, or poor soils, or just small lot sizes. In B.C., the Sewerage System Regulation is fairly outcome based: maintain a 30m setback from a well, hire an "authorized person" to do the work, and you'll be just fine. It's up to the professional expertise of the authorized person to determine appropriate setbacks and siting for the sewage systems. If you're in one of these tough geographic areas, your course of action is typically a) hire an engineer to design you a treatment plant that treats the effluent before it goes into the soil, or b) find an Onsite Wastewater Practitioner who's willing to fudge some data to get your system in the ground.
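Whichever route you take, the regulatory check itself is about as simple as it sounds. As a minimal sketch of the kind of outcome-based test the regulation relies on, here's what a 30 m well setback check might look like in Python; the 30 m figure is the one from the regulation, but the coordinates, function names, and straight-line assumption are hypothetical simplifications.

```python
import math

# Horizontal setback required between a drinking water well and a sewage
# dispersal field under an outcome-based rule like B.C.'s Sewerage System
# Regulation (30 m to a well). Everything else here is illustrative.
WELL_SETBACK_M = 30.0

def distance_m(x1, y1, x2, y2):
    """Straight-line distance between two points on a site plan, in metres."""
    return math.hypot(x2 - x1, y2 - y1)

def meets_well_setback(well_xy, field_xy, required_m=WELL_SETBACK_M):
    """Return True if the dispersal field is at least `required_m` from the well."""
    return distance_m(*well_xy, *field_xy) >= required_m

# Hypothetical site plan coordinates, in metres
print(meets_well_setback((0.0, 0.0), (25.0, 20.0)))  # 32.0 m apart -> True
```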
Apparently, there are considerations besides just public health when you're looking to place your septic system in a less-than-ideal location. In Ohio, only 6.4% of soils have the requisite 4' vertical separation to allow for traditional tank-and-field on-site sewerage systems. As mentioned above, there are alternatives to traditional systems, but those typically cost a fair bit more money and require some professional input (which also doesn't come cheap). A recent study has shown that costs associated with inadequate sewerage systems can include repair costs, human health costs, increased system maintenance costs, and a "loss of property valuation".
The study looked at 800 on-site sewerage systems (out of a possible 22,000 in Licking County, OH), of which 616 used traditional tank-and-field systems. The researchers only kept those for which appropriate soils data were available, leaving a sample of 549 properties for the study. Using the hedonic pricing method, which "uses the different characteristics of a traded good, such as real estate, to estimate the value of a non-traded good, such as water or soil quality", the researchers were able to identify the effect that a well-functioning septic system had on property values in the county.
The researchers looked at a number of variables that could have an effect on property value (property size, number of bedrooms, etc.), but their specific hypothesis concerned the quality of soils with regard to sewage disposal. They found that a property with soils suitable for a traditional leach field system was worth $14,062 more than a comparable property with soils unsuitable for an onsite sewerage system. Properties with soils suitable for a mound system were worth $12,897 more. This corresponds to differences of 6.8% and 6.2%, respectively. Interestingly, these price differences were actually higher than the cost in Ohio of installing a drip irrigation or mound system.
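For readers unfamiliar with hedonic pricing, a premium like that $14,062 figure comes out of a regression of sale price on property characteristics plus an indicator for soil suitability: the coefficient on the indicator is the implicit price of the non-traded good. Here's a minimal sketch of those mechanics; the data, coefficients, and variable names are invented for illustration and are not the study's data or model.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 500

# Hypothetical property characteristics (not the Licking County data)
acreage = rng.uniform(0.5, 5.0, n)
bedrooms = rng.integers(2, 6, n)
suitable_soil = rng.integers(0, 2, n)          # 1 = soils suit a leach field

# Simulated sale prices with a built-in ~$14k soil premium plus noise
price = (60_000 + 12_000 * acreage + 15_000 * bedrooms
         + 14_000 * suitable_soil + rng.normal(0, 10_000, n))

# Hedonic regression: the coefficient on `suitable_soil` estimates the
# implicit price of soil quality for sewage disposal
X = sm.add_constant(np.column_stack([acreage, bedrooms, suitable_soil]))
model = sm.OLS(price, X).fit()
print(model.params)   # last coefficient ~ the soil-suitability premium
```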
While the research outcome does a fantastic job of tying economic benefits to good soil profiles, it's worth noting that the median value of a housing unit in the county of study was $110,700. As of January 2014, the median value of a housing unit in the North Okanagan (where I live) is $238,750, or about 2.2 times as much. It would be valuable to look at similar statistics in places with higher housing costs where onsite sewerage is prevalent (unlike Metro Vancouver) to see if similar correlations between quality soils and housing values could be found.
Source: Vedachalam, S., Hitzhusen, F.J., & Mancl, K.M. (2013) Economic analysis of poorly sited septic systems: a hedonic pricing approach. Journal of Environmental Planning and Management, 56(3), 329-344.
Labels:
property values,
real estate,
septic,
sewage,
wastewater
Are there better methods for identifying sewage than coliforms?
Typically when health or environment officials are looking for confirmation of sewage contamination of a water source, they'll go with indicator organisms as evidence. By sampling the water and looking for fecal coliforms, you can tell whether it has been contaminated by bacteria that typically reside in the gut of warm-blooded animals. There's also fluorescein dye if you're looking to confirm that sewage isn't staying in the ground, but that's only effective if the septic failure leads to wastewater being discharged to the ground surface. In other words, if the sewage is making its way into an aquifer, you're not going to see the dye.
There are a couple of problems with fecal coliforms as indicator organisms: they're not necessarily confirmation of human sewage (i.e. just confirmation of some sort of fecal contamination) and they're not overly persistent in the soil. If you're just trying to tell a water system operator that they've got some contamination issues and need to issue a public notification, they work just fine. But if you're looking to confirm that some actual sewage is getting into the water, you're going to have a hard time in front of a judge.
Researchers from Ontario looked at a wastewater plume from a septic field serving a campground that had been in existence for around 20 years (which, incidentally, is about the life span of the average on-site sewerage system), and note that typical indicators of contamination (besides coliforms) are not necessarily unique to sewage, and therefore don't make the best indicators. Chemical compounds that are unique to wastewater (ibuprofen, pseudoestrogens, carbamazepine) haven't been studied enough to give a clear indication of how long they persist in the environment. They suggest that artificial sweeteners might have value as a wastewater indicator, since they're unique to human waste, resistant to breakdown in normal sewage treatment, and persistent in groundwater.
By setting up a number of piezometers and trace gas sampling points along the wastewater plume from the campground, the researchers were able not only to sample the groundwater for the contaminants of interest, but also to perform tritium/helium age dating to identify the age of the wastewater plume. Unsurprisingly, their study showed that once you got about 50m away from the sewage dispersal field, nutrients and pathogens normally found in sewage were reduced to non-detectable levels. Of the sweeteners tested for persistence, they found that cyclamate and saccharin appeared to degrade quite effectively, while acesulfame and sucralose concentrations remained relatively constant regardless of distance from the septic tank.
Since acesulfame was detected at levels nearly 1000x higher than background concentrations in the wastewater plume, and degradation didn't occur over approximately 20 years of sewage system use, it presents itself as a potentially viable indicator of wastewater contamination. Apparently, acesulfame is also added to some animal feed, so it could be used as an indicator of groundwater contamination from manure spreading as well.
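As a back-of-the-envelope illustration of how a persistent sweetener could be used to flag wastewater impact, here's a minimal screening sketch; the background concentration, the sample values, and the "clearly elevated" ratio are all hypothetical placeholders, not values from the study.

```python
# Screening a groundwater sample for wastewater impact using a persistent
# artificial sweetener as the tracer. All numbers below are hypothetical.
BACKGROUND_ACESULFAME_UG_L = 0.005   # assumed ambient/background level
IMPACT_RATIO = 10                    # assumed "clearly elevated" multiple

def likely_wastewater_impacted(sample_ug_l,
                               background_ug_l=BACKGROUND_ACESULFAME_UG_L,
                               ratio=IMPACT_RATIO):
    """Flag a sample when acesulfame is well above background."""
    return sample_ug_l / background_ug_l >= ratio

print(likely_wastewater_impacted(4.8))    # ~1000x background -> True
print(likely_wastewater_impacted(0.006))  # near background -> False
```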
There's still further work to be done, since this is just one study of one onsite sewerage system. However, it shows great potential as a new way of determining whether aquifers are being impacted by nearby wastewater. It's worth noting that, from a public health perspective, there is always the issue of cost when speaking to new indicators. Current culture-based sampling for indicator organisms is relatively inexpensive, and provides a "good enough" method of identifying contamination. Moving to a compound that requires analytical chemistry for identification may simply be too expensive for publicly funded environmental health organizations.
Source: Robertson, W.D., Van Stempvoort, D.R., Solomon, D.K., Homewood, J., Brown, S.J., Spoelstra, J., & Schiff, S.L. (2013). Persistence of artificial sweeteners in a 15-year-old septic system plume. Journal of Hydrology, 477, 43-54.
Labels:
bacteria,
drinking water,
E. coli,
environmental health,
groundwater,
public health,
septic,
sewage,
wastewater
How do we know if our beaches are safe?
I live in the Okanagan region of B.C., which is known for two things: wine and beaches. I might be exaggerating a bit on the "beaches" part, but the Okanagan is certainly a summer destination for a lot of people. The major municipalities and regional districts in the area have done a good job of maintaining public beach access, but with that access comes the risk of exposure to disease-causing bacteria. Historically, the local health authority was responsible for sampling the water in close proximity to the beach, testing it for bacterial indicators (coliforms and fecal coliforms), and suggesting a beach closure if any individual sample was above a certain level, or if the running log (geometric) mean of consecutive samples got too high. In the past couple of years, local governments have taken on this sampling role, but they've used the same indicators and tests to determine whether or not a beach is "safe" for public access.
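For anyone curious what that "running log mean" check actually involves, here's a minimal sketch. The 200/400 E. coli per 100 mL values are the commonly cited Canadian recreational water guideline numbers; treat them (and the sample counts) as placeholders and substitute whatever your jurisdiction actually uses.

```python
import math

# Beach screening: compare the geometric (log) mean of recent samples and
# the single-sample maximum against action levels.
GEO_MEAN_LIMIT = 200      # E. coli per 100 mL, geometric mean (assumed)
SINGLE_SAMPLE_LIMIT = 400  # E. coli per 100 mL, any one sample (assumed)

def geometric_mean(counts):
    """Geometric mean of a list of positive counts."""
    return math.exp(sum(math.log(c) for c in counts) / len(counts))

def beach_exceeds_limits(counts):
    """True if either the geometric mean or any single sample is over its limit."""
    return (geometric_mean(counts) > GEO_MEAN_LIMIT
            or max(counts) > SINGLE_SAMPLE_LIMIT)

# Hypothetical run of five weekly samples (E. coli per 100 mL)
samples = [80, 150, 120, 510, 90]
print(round(geometric_mean(samples)))   # ~146
print(beach_exceeds_limits(samples))    # True: one sample is over 400
```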
But are culture-based methods of counting coliforms really the best indicator of beach safety, or are there better methods out there? A (very large) group of researchers in the United Kingdom looked at this topic to determine whether molecular methods for enumerating coliform bacteria were better than traditional cultures. They bring up the very valid point that culture-based methods of enumeration can take a couple of days to get results back, whereas molecular methods (like quantitative polymerase chain reaction, or qPCR) can give results in a couple of hours. When dealing with beach water quality for bather safety, the quick turn-around time could be the difference between closing the beach while the hazard exists, and closing the beach once the hazard has already passed (and people have already been exposed to it).
One difficulty the paper points out with the use of qPCR is the current lack of epidemiological evidence linking level of exposure to human illness. Because it's a relatively new technique, there isn't yet a strong link between a sample result and the potential for illness, like there is for traditional culture counts. The authors also point out that the specificity of qPCR can be both a blessing and a curse: it's nice to not have to rely on indicator organisms (like E. coli) as catch-alls for human pathogens, but how do you identify the specific pathogens you want to target with the qPCR?
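For context on how qPCR turns a fluorescence signal into a number in the first place, here's a minimal sketch of standard-curve quantification. The slope and intercept are hypothetical; a real curve comes from running known dilutions of the target alongside the samples.

```python
# Converting a qPCR cycle-threshold (Ct) value into a target copy number
# using a standard curve of the form Ct = slope * log10(copies) + intercept.
SLOPE = -3.32       # ~100% amplification efficiency (assumed)
INTERCEPT = 37.0    # Ct of a single copy for this hypothetical assay

def copies_from_ct(ct, slope=SLOPE, intercept=INTERCEPT):
    """Estimate target gene copies in the reaction from a Ct value."""
    return 10 ** ((ct - intercept) / slope)

print(round(copies_from_ct(25.0)))   # ~4,100 copies for this curve
```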
And of course, there's always the cost consideration. Implementing new testing methodologies can be exceedingly expensive, especially when you're talking about molecular biology. Add to that the fact that there would necessarily be an overlap between the two techniques as the transition took place, and you're looking at even higher costs. The researchers argue that the cost increase may not actually be associated with a significant benefit to public health: do people really need "real-time" beach data, or would time and money be better spent building predictive models using existing culture counts?
The researchers came up with a number of recommendations for the UK working group prior to implementation of a molecular method for determining beach safety, but the bottom line is that we're just not there yet. More research and evidence needs to be gathered, and the cost of transitioning to a new methodology needs to be reduced before local governments or health regions will consider the transition (given that there aren't that many concrete benefits).
It's also worth noting that as acute care costs rise, money for environmental health initiatives like beach monitoring necessarily decreases (see: local governments taking on sampling, as noted above). There have to be very real and clear benefits to the program to even keep health authorities involved, let alone get them to invest new money.
Source: Oliver, D.M., et al. (2014). Opportunities and limitations of molecular methods for quantifying microbial compliance parameters in EU bathing waters. Environment International, 64, 124-128.
2.26.2014
How to keep your carrots safe: use a sharp knife
Most people probably don't think too much about how they're preparing their carrots: you peel 'em, slice 'em, eat 'em. In an industrial setting, however, the method of preparation can affect how safe the product is for consumers. If pathogens are given the opportunity to penetrate beyond the surface of the product, a food product that is eaten raw can be tough to make safe. This is why most public health legislation requires cut fruits and vegetables to be maintained at fridge temperatures: if bacteria have been introduced, they won't have the opportunity to multiply or produce toxins.
A study by Irish food scientists published in volume 40 of Food Control looks at the effect of different means of slicing, peeling, and storage of carrots to identify if any controls could be put in place to reduce the risk of E. coli O157:H7 contamination. They took a bunch of carrots, dunked them in a solution of E. coli for 30 minutes, and then washed them twice for a minute with distilled water before packaging them. The packaging and storage in plastic allowed the researchers to replicate the type of environment in which industrially-produced carrots would be found. The carrots were stored in the plastic for five days, at either 4°C (recommended) or 10°C (abused).
After the storage period ended, the researchers peeled, sliced, and stored the carrots in various ways to identify differences in E. coli penetration and resilience. The carrots were peeled either by hand or using an industrial abrasion peeler. They were sliced by hand using a razor blade, or mechanically with either a dull or sharp blade on a vegetable processing machine. Storage of the cut carrots was at either 4°C or 10°C, as with the pre-cut, inoculated carrots. They also looked at the effect of gas atmosphere on the growth and survival of the bacteria, but that's not something that's easily controlled in a home or restaurant environment, so I won't go too deeply into it.
While the original analysis of the carrots showed similar surface levels of E. coli, more bacteria were able to penetrate deep into the tissue when the cutting was performed with the dull blade. As time progressed, bacterial counts were higher throughout the carrot with the dull blade than with the razor cutting. For the peeling methods, there was originally no difference between the methods in E. coli levels throughout the carrot, though as time progressed, the industrial peeler led to higher counts throughout. It is worth noting that overall levels of E. coli, for both methods of peeling, decreased over time. In results that should surprise nobody, there was less bacterial growth at 4°C than at 10°C, and surface bacteria grew more significantly than bacteria within the carrots.
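To make comparisons like "counts were higher" concrete, microbial results of this kind are usually reported and compared on a log10 CFU scale. Here's a minimal sketch of that arithmetic with entirely made-up counts; the study's actual numbers are in the paper.

```python
import math

# Comparing two treatments on the log10 CFU/g scale, the usual way microbial
# count data are reported. The counts below are invented for illustration
# and are NOT values from the Food Control study.
def log_cfu(count_per_g):
    """Convert a plate count (CFU/g) to log10 CFU/g."""
    return math.log10(count_per_g)

sharp_blade_cfu = 2.0e3   # hypothetical internal count, CFU/g
dull_blade_cfu = 6.3e4    # hypothetical internal count, CFU/g

difference = log_cfu(dull_blade_cfu) - log_cfu(sharp_blade_cfu)
print(f"Dull blade is {difference:.1f} log10 CFU/g higher internally")  # 1.5
```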
The researchers were looking at the results of this study to guide industrial operations: how to keep produce safe between the field and the table. However, the results also have implications for the home cook or the restaurant operator. If you're planning to pre-process fruits and vegetables, it's better to cut them with a sharp knife, and keep them stored at or below 4°C. How you peel them is entirely up to you, because it doesn't seem to matter from a food safety perspective.
Source: O'Beirne, D., Gleeson, E., Auty, M., & Jordan, K. (2014). Effects of processing and storage variables on penetration and survival of Escherichia coli O157:H7 in fresh-cut packaged carrots. Food Control, 40, 71-77.
Labels:
E. coli,
environmental health,
food,
food processing,
food safety
Do people learn from their food hygiene mistakes?
While a large proportion of people who come down with a food-borne illness will blame the last restaurant they ate at (regardless of where or when that was), in reality, many cases come from home kitchens. Whether people fail to properly clean and sanitize their food contact surfaces, or they undercook their potentially hazardous food items, you face more risk of becoming ill at home than when you eat in a restaurant (unless, of course, you've developed HACCP plans for your home recipes, in which case: kudos).
Having a food-borne illness is not a fun time. It's exceedingly unpleasant. So once people have gone through that misery, do they learn anything from it? In other words, will they continue to make the same mistakes, or will they practice exceptional food handling procedures to ensure it never, ever, ever happens to them again? A study in volume 41 of Food Control looked to answer that very question by examining the food handling and kitchen hygiene of individuals who had previously suffered from Campylobacter infections.
The case control study used a survey to identify behaviors and perceptions about food safety, and followed up with a "kitchen sampling programme" among a sub-set of the cases and controls. Cases were identified as adults who had lab-confirmed campylobacteriosis, and were age/gender/geography matched to controls. The survey asked some general questions about food practices, as well as some questions that were specifically included to identify optimistic bias (e.g. asking what risk of illness came from their kitchen, vs. the risk in the average home kitchen). The kitchen sampling was done with environmental swabs on counters and cutting boards, and an analysis of the dish towel for bacterial growth.
In identifying optimistic bias, the researchers found that everybody thinks their kitchen is safer than the average home. Whether they were cases or controls, nearly 60% of respondents indicated that the average person is "at a significantly greater risk of getting food poisoning" than themselves. Interestingly, when the same question was asked six months later (i.e. after the campylobacteriosis was a distant memory), the cases showed less of an increase in optimistic bias than the controls. Perhaps, with time to reflect on their illness, they were less convinced that they were as amazing at kitchen hygiene as they had originally thought. It's also noted that controls, who had not become ill, likely gained confidence as time went on and they continued to remain healthy.
In the "kitchen behavior" part of the survey, cases were found to be more likely to state that they wash their raw chicken pieces and their pre-washed bagged salads. These actions are not recommended, as washing your raw chicken increases the chance of cross-contamination, and your bagged salad is already washed much more thoroughly than you'd be able to achieve. Cases over the age of 60 were much more likely to state that they washed their chicken and their bagged salad.
While the survey part of the study relies on people's honesty and memory in answering questions, the kitchen sampling programme cannot be faked. Based on the responses above and the fact that the cases actually had food poisoning, one would expect their kitchen hygiene to be marginally worse than the controls. In reality, however, there was no difference between the two groups whatsoever. It was noted that the participants were warned that the samplers were coming ahead of time, which could have allowed them to clean beforehand, but they weren't made aware of where the samplers would swab.
The bottom line is that there is still work to be done in convincing the public that kitchen hygiene behaviors at home are an important part of reducing cases of food poisoning. The fact that the controls became more optimistic about their behaviors over time, and that the >60 year old age cohort was practicing unsafe food handling procedures shows that the education is not sinking in. As mentioned in the opening paragraph, people generally like to blame the last restaurant they ate at for their illness, a fact reflected in the optimistic bias. Ensuring that people are aware that their home-cooked meals can cause illness is the first step to reducing the burden on the health care system.
Source: Millman, C., Riby, D., Edward-Jones, G., Lighton, L., & Jones, D. (2014). Perceptions, behaviours and kitchen hygiene of people who have and have not suffered campylobacteriosis: A case control study. Food Control, 41, 82-90.
2.25.2014
A new way to keep meat safe?
Forthcoming in the July 2014 issue of the journal Food Control is a study assessing the efficacy of bacteria obtained from the gut of veal calves in protecting against potentially pathogenic organisms in raw, vacuum-packed meat. Using the bacteria Lactobacillus animalis SB310 and Lactobacillus paracasei subspecies paracasei SB137, the researchers looked to see whether inoculating raw, vacuum-packed meat could prevent spoilage and pathogenic organisms from multiplying to levels that would pose a risk to food safety. Essentially, the goal is to apply bacteria to meat to prevent bacteria from growing on the meat.
The bacteria noted above were identified as potential candidates for this "biopreservation" because they're lactic acid bacteria (LAB), which compete for nutrients with pathogenic organisms and produce antimicrobial compounds, while remaining safe to eat for humans. The study looked at a number of potentially pathogenic organisms that might be found in meat, including E. coli, Salmonella, Yersinia enterocolitica, and Listeria monocytogenes. The results were generally positive: whether alone, or in combination, the Lactobacillus strains were successful in limiting the growth of the pathogenic bacteria. With a couple of exceptions, combining the strains was much more successful at reducing bacterial growth than applying them individually.
In the second part of the study, the researchers sought to determine whether the inhibition was due to the Lactobacillus cells producing a compound that prevented the growth of pathogens. Since they had already shown that the presence of the cells themselves was effective, they simply removed the cells from the growth medium by centrifugation and applied the supernatant to the growth plates. The supernatant produced no noticeable reduction in the growth of the pathogens. Based on the supernatant's failure to inhibit pathogenic growth, the authors posit that the production of organic acids by the Lactobacillus species is responsible for preventing other bacteria from growing.
It's worth noting that while the study did show that the two Lactobacillus species had an inhibitory effect on the growth of potentially pathogenic organisms, it was very much an in vitro experiment. There is still a lot of work to be done in determining how to actually inoculate pieces of meat with this type of organism to provide a net food safety benefit. However, this is an exciting step forward in preventing pathogens from multiplying in meat: biopreservation is natural, safe to consume, adds no chemicals, and does not alter the taste or texture of the product - all things that consumers want.
Source: Tirolini, E., Cattaneo, P., Ripamonti, B., Agazzi, A., Stella, S., & Bersani, C. (2014). In vitro evaluation of Lactobacillus animalis SB310, Lactobacillus paracasei subsp. paracasei SB137 and their mixtures as potential bioprotective agents for raw meat. Food Control, 41, 63-68.
Labels:
biopreservation,
environmental health,
food,
food safety,
meat
In "no way, really?" news ...
A study in this month's "Critical Reviews in Food Science and Nutrition" (Vol. 54, Issue 9) has shown a correlation between eating away from home and anthropometric changes (e.g. obesity, increased waist circumference).
The authors reviewed 15 prospective studies and, following a review of the quality of the data contained within, selected seven of them for analysis. What they found probably shouldn't surprise anybody: if you eat away from home "frequently", you're probably going to weigh more than somebody who eats away from home less frequently. The authors further looked at different types of food sources away from the house, and (surprise again!) found that fast-food outlets had a higher correlation with these negative anthropometric outcomes than traditional restaurants. The article notes that "other out-of-home foods" lack the research needed to establish a correlation between consumption and negative body changes, and suggests that more research be undertaken.
While nobody should be surprised by the results of this literature review, it's likely that the restaurant industry will be quick to point out that factors besides your choice of eating location contribute to a healthy weight, including exercise, genetics, and overall lifestyle. Though the article looked at long-term prospective studies, it did not examine any potential sources of anthropometric change other than where the meals were consumed.
Another consideration is that the articles reviewed by the authors were published between 1998 and 2011. There have been many changes in the types of food being served "away from the home" over the past 15 years, both for marketing reasons (I'm looking at you, fast-food salads) and for legislative reasons (see: B.C.'s Public Health Impediments Regulation dealing with trans fats). Presumably, a prospective study undertaken now and published in 10 years' time would show similar outcomes, but perhaps not quite to the same extent.
Policy makers and food-industry regulators should take heed of this research. Though some legislation has been put in place that makes "out-of-home food" healthier (see above re: trans-fat laws), there is still work to be done to ensure that when people are unable to eat at home for whatever reason, they are not putting themselves at risk of significant health issues.
Source: Dossa, R. A., Nago, E. S., Lachat, C. K., & Kolsteren, P. W. (2014). Association of Out-of-Home Eating with Anthropometric Changes: A Systematic Review of Prospective Studies. Critical Reviews in Food Science and Nutrition, 54(9), 1103-1116.
Labels:
fast-food,
food,
health,
restaurants,
science