Accreditation: Using the CCHSA's New AIM Standards--Perspective of a Team Leader in a Partner Facility

Barb Goertzen, CCHRA(C)
Co-sponsored by the Canadian Council on Health Services Accreditation


My role in this accreditation survey was that of team leader for a small facility. Because this accreditation differed so much from any previous survey, I have attempted to analyze why the process was so dynamic. Apparent at the outset was that the Canadian Council on Health Services Accreditation had delivered on its plans to make the survey a continuous improvement process and to measure outcomes. Further along, we saw an incredible level of buy-in and involvement from many different levels of staff: department heads were quite involved, but front line staff and governance also became active participants. All team members were impressed with our team's growth and success. Aside from planning and facilitating, I did little work, and each member seemed unaware of how much work they contributed to the process. Two community stakeholders--a pastoral care representative and a family support member--supported this team. Finally, management support on the team consisted of two dedicated board members and our very positive and supportive CEO. These people inspired the trust necessary to set people free to work uninhibited as a team, and they valued the evidence-based decision making that is inherent to accreditation.

Canadian Council on Health Services Accreditation Structure

The Canadian Council on Health Services Accreditation is a not-for-profit organization similar to accrediting bodies in other countries (such as its American counterpart, the Joint Commission on Accreditation of Healthcare Organizations). The Canadian organization was incorporated in 1958 out of the Joint Commission. In 1995 its name was changed from the Canadian Council on Health Facilities Accreditation to the Canadian Council on Health Services Accreditation (CCHSA) to reflect the range of organizations accredited. This encompasses the shift from an acute hospital focus to all aspects of health: the CCHSA now accredits many programs delivered on an outpatient basis, home care, and patient wellness programs. It also accommodates the change in structure from individually administered boards for each facility to a regional board structure. The CCHSA has recently been among the winners of the Conference Board of Canada/Spencer Stuart National Award in Governance, specifically as a result of its self-evaluation program.

Team Experience

The Coaldale Health Care Centre was formerly known as the Coaldale Community Hospital. It is run by a private board and has been accredited since small hospitals became eligible in the 1960s. The Board is chosen from shareholders in the Coaldale Community Hospital Association (formerly the Coaldale Mennonite Health Society).


The AIM Process

The Canadian Council on Health Services Accreditation uses an AIM (Achieving Improved Measurement) process. This quality improvement program runs on a three-year cycle. The CCHSA's definition of quality is "an organizational philosophy that seeks to meet clients' needs and exceed their expectations by using a structured process that selectively identifies and improves all aspects of service." The AIM process is built on a

  • Population health focus  
  • Team-based process

Team Experience

The Coaldale Health Care Centre was well versed in the AIM philosophy through a facility-wide Quality Assurance Program followed by TQI initiatives. This facility is in a health region in Alberta; there were 17 health regions at the time of this assessment, since reduced to seven. The Region was assessed as a whole: Coaldale participated under the Chinook Health Region as a privately run hospital or, in accreditation terminology, a "Partner Facility."

Part of a quality improvement process involves aligning with a mission and philosophy and maintaining that alignment while determining a facility's role. The CCHC has a philosophy based on the Christian vision of healing, incorporating both physical and spiritual dimensions. Thus, in addition to representation from all departments, there was pastoral care participation, and the pastoral care vision was referenced throughout the survey. Community stakeholder involvement was a very new concept in an accreditation survey. While many community volunteers had actively supported this facility in the past, this was the first time that a recipient-of-care representative was involved at this level of evaluation.

Other team members represented our governance, including the CEO and two board members, along with a physician. The participating physician joined the team because his partial retirement left him time to attend meetings. Unbeknownst to us, however, he also had very good experience with the accreditation process and with identifying areas for improvement.

I had a good idea of where I wanted this team to go, but not of how to get there. The team members jumped in and did the work themselves: the physician read our final survey and instantly suggested listing our lower-rated standards as the focus of our quality improvement projects, and our CEO led the prioritization of those standards. Even front line staff members were not afraid to suggest good ways of accomplishing our tasks.

Survey Standards and Evaluation

The CCHSA has a national set of indicators. The standards are assessed on a seven-point numerical compliance scale. Standards assess quality in terms of four dimensions, which are as follows:

  • Responsiveness
  • Systems Competency
  • Client/Community Focus
  • Worklife

Descriptors further elaborate each dimension above. Accreditors assign a risk factor to each of their recommendations using a risk rating scale. Accreditors rate:

  • Quality
  • Risk
  • High recommendations related to specific issues as identified by the CCHSA
  • The number of high recommendations in any given team.

Team Experience

Terminology was a large hurdle in accomplishing this accreditation. Even though the team had attended an educational session, the terms needed to be reviewed as we progressed. In supporting each rating that the team awarded in our self-assessment, it was helpful to refer back to the dimension being reviewed: some standards looked the same but measured a different dimension.

Survey Types

There are many different surveys designed to represent all stakeholders and functions within the health delivery process.

Team Experience

This facility initially completed the Leadership and Partnership Survey. It was later discovered that, as an independent facility within the region, we were also obliged to complete additional surveys to represent our governance, finance, information systems, and environmental services.

Survey Completion Process

Senior management attended an education session for this process, and CCHSA held a two-day orientation session for all departmental representatives explaining the new AIM process and definitions. This was the first survey in electronic format in our province. A major part of survey completion was to identify, and begin work on, one or two major quality improvement projects. There were many program-focused surveys, such as Population Health, Community Health, Emergency Services, Children's Health, and Women's Health. Other surveys covered each private facility in the region, including nursing homes and two religious-based private facilities, one of which was the Coaldale Health Centre. Program-centered surveys naturally overlapped with facility-based surveys, which added to the continuity. The 15 teams underwent some team-building exercises.

Team Experience

Our team had 13 members, so we asked a few front line staff to attend when they could, expanding the team to 15. Initially, because there were so many standards to work through, we thought we would break into groups to complete the survey; many of the other teams, averaging 14 members, coincidentally had the same idea. However, at the first session it was suggested we work through one section together before breaking into groups. The dynamics of the group enabled the team to work together through all the standards, though this required more meetings than initially planned.

Along with the ongoing review of definitions and process mentioned above, data input was completed after we worked through each section of the survey and printed out for review at the next meeting. Two lists were maintained as we went along: an abbreviation and acronym list and a list of references to support each rating.

At the end of our self-evaluation, each team member reviewed a full printout of our electronically entered surveys. Lower ratings were compiled and grouped into candidate self-improvement processes, and a grouping exercise cemented two projects for focus. Plans were made for reviewing ongoing progress through built-in organizational structures. Halfway through the main survey review, it was determined that four further surveys would need completion. These were assigned to various departmental members of the team, who were encouraged to elicit other staff participation to help them complete each survey.

Monthly Team Leader meetings within the Region reinforced my role as Team Leader. We were updated on new information, questions and answers were shared, and we continued to provide progress reports. When all surveys were completed, the electronic data from the roughly 15 regional teams was combined in the database.

Regional team support was given by enabling each team to participate in, or observe, a one-hour Mock Survey.

Our facility also had input on other teams; we were part of the Emergency Care and the Continuing Care Programs.

Survey Outcome Evaluation

In addition to the survey site visit, the survey team interview, and the survey evaluation, accreditation teams assess outcomes through three region-organized focus groups:

  • Staff
  • Clients/Patients
  • The organization's community stakeholders

Team Experience

This survey differed from previous surveys in that newer provincial privacy legislation dictated consent procedures for interviewing. We now also have federal privacy legislation that will continue to be a challenge.

Identifying candidates for health satisfaction interviews was a function I performed. As a non-caregiver and non-administrative person, I enlisted the help of nursing and rehab staff to guide whom to choose, and how, for our long-term resident interviews. This process was particularly rewarding in that such an opportunity for contact with continuing care patients and residents seldom arises in a Health Records Department. It also enabled me, as a non-caregiver, to see the value of assessing patient satisfaction. Though there was some concern that our selection would favor positive feedback, I have always been sufficiently convinced of the quality of care that whoever was interviewed would give a good measure of it. An acute inpatient interview was a concern, as we could not plan around these short-term patients, but we were fortunate to have a patient who had experienced health care in other settings and was able to give a good, reasonable interview.

Murphy's Law: this facility is well supported by fundraising and input from the community. Additions realized include a chapel, a garden/landscape project, and a solarium. We had planned to conduct interviews in the brand-new solarium, but a new breakfast program for long-term residents was scheduled there, and we did not find out until the last minute. Last-minute accommodations included using patient/resident rooms and our physician lounge for the resident/facility-user interviews.

Survey Week and Immediate Regional Feedback Session

Because this was a private facility within the Region, two surveyors spent a half-day at the facility. The Coaldale team met the surveyors in a one-hour meeting.

Team Experience

The team was very relaxed and frank with the accreditation surveyors. Even those not as vocal and participative during our survey completion meetings made an effort to be expansive in answering accreditation surveyor queries.

At the end of the weeklong regional survey, all team members were welcomed to an immediate feedback session at the central site on the last afternoon. Though there was some minor panic about the frankness of our team's comments, the feedback may simply have reflected the accreditors' discernment skills and their subsequent recommendations. The immediate feedback was a new process and a good end to the survey.

Survey Results

The final survey report from the CCHSA reached the region substantially later. Because this was a health region survey, the results were tabulated as such and released to the health region authority in our nearby large centre.

Team Experience

These favorable survey results were shared with each regional team, who then shared them with their team members. Because it was a regional survey, individual sites became part of the assessment of the whole health delivery system, while the individual experience remains with each site.


Though the CCHSA's first AIM surveys were an impressive experience, the process has not remained static. It has evolved based on commonalities in the results of those very first AIM surveys. Following the CCHSA's recent developments--for example, this year's accreditation focus on Patient Safety--creates an urge to share these new initiatives. However, we need to recognize that the first survey processes provided a very important foundation for each subsequent survey. The fact that all those involved, including all levels of staff, all levels of management, patients, residents, and community stakeholders, have gained new insight and enthusiasm for this process speaks to the value of accreditation.


Source: 2004 IFHRO Congress & AHIMA Convention Proceedings, October 2004