Development of a competency framework for postgraduate training in obstetrics and gynaecology using a Delphi study
Ellen Allaert1, Marieke Robbrecht2, Tjalina Hamerlynck1 and Steven Weyers1
1Ghent University, Faculty of Medicine and Health Sciences, and Women's Clinic, Ghent University Hospital, Belgium
2Ghent University, Faculty of Medicine and Health Sciences, and Department of Paediatrics, Ghent University Hospital, Belgium
Submitted: 18/03/2024; Accepted: 01/02/2025; Published: 24/02/2025
Int J Med Educ. 2025; 16:21-35; doi: 10.5116/ijme.679e.0509
© 2025 Ellen Allaert et al. This is an Open Access article distributed under the terms of the Creative Commons Attribution License which permits unrestricted use of the work provided the original work is properly cited. http://creativecommons.org/licenses/by/3.0
Abstract
Objectives: The aim of this study was to create a new integrated competency framework for the postgraduate training in obstetrics and gynaecology and to reach consensus through a Delphi study.
Methods: Using the Canadian Medical Education Directives for Specialists (CanMEDS) framework as a basis, three existing frameworks were merged by screening for keywords. Subsequently, consensus on the unified framework was reached through a Delphi study: a group of 18 Belgian experts was asked for their opinions on the competencies through three successive questionnaires.
Results: In the first round, one of the 91 competencies was deemed irrelevant. In the second round, the competencies were reviewed for content and formulation, after which consensus was not reached on 15 competencies. These 15 competencies were adjusted as needed based on comments collected during the first two rounds. The adjusted competencies were then sent back to the experts in the third round, resulting in a final consensus on all 91 competencies. However, the comments indicated that several competencies were considered broad or vague, casting doubt on their practical applicability.
Conclusions: Through a Delphi study, consensus was reached on a newly composed competency framework. Such a holistic competency framework can form the basis of a curriculum reform in the postgraduate training in obstetrics and gynaecology within Belgium, but also in a more international context. Further research is needed to develop an assessment tool to implement these competencies in practice.
Introduction
Since the turn of the century, Competency-Based Medical Education (CBME) has been implemented in residency training.1 Traditionally, medical education has been time-based: the focus lies on the process of learning and on exposure to learning opportunities for specified periods of time.2-4 Competency-based education emphasises the product of learning: the outcome drives the educational process, and how the outcome is reached is secondary.1-3, 5, 6
Although Competency-Based Education (CBE) has been implemented in multiple professional environments for many years (e.g. NASA, the military), it is only in recent decades that the benefits of CBE in medicine have become clear.7 CBME enhances transparency in medical education, empowers students, enables comparison of curricula across countries, and improves patient safety.1, 7-9 Moreover, CBME addresses the need for evidence-based, cost-effective, patient-centred health care.10 Many countries, such as the United States, Canada, Australia, the Netherlands and the United Kingdom, already use a competency-based national educational program for residency training.1, 4, 11
Over the years, specific competency frameworks have been developed for postgraduate training in obstetrics and gynaecology, such as the Objectives of Training in the Specialty of Obstetrics and Gynecology12 in Canada and the Dutch national training program for gynaecology and obstetrics ('LOGO').13 In Belgium, no uniform framework exists for postgraduate training in obstetrics and gynaecology. The general CanMEDS framework has been adopted in the undergraduate curriculum of several Belgian universities.11, 14 A framework defined for the MMSM (Master of Medicine in Specialist Medicine) is based on the CanMEDS competency framework, using four CanMEDS roles as the foundation for the rest of the framework. This framework is used for feedback and evaluation during workplace-based learning in all specialist medicine disciplines. The original CanMEDS framework is not included in postgraduate training in Belgium. The competency framework of the European Union of Medical Specialists (UEMS), more specifically of the European Board and College of Obstetrics and Gynaecology (EBCOG) and the European Network of Trainees in Obstetrics and Gynaecology (ENTOG), provides an overview of all the knowledge and technical skills that a trainee in obstetrics and gynaecology should master upon certification.15 It is used as a guideline for training and assessment but is not officially incorporated in the postgraduate curriculum. The absence of a uniform framework complicates the evaluation of residents from different institutions and hinders the final assessment of medical competence for certification. At an international level, this variation hampers the evaluation of a resident's level of proficiency and therefore complicates the exchange of residents and knowledge.
Given the increasing importance of high-quality care and the growing internationalisation of medicine, it is crucial that postgraduate training in obstetrics and gynaecology in Belgium is standardised. A uniform competency framework can also serve as a basis within a more international context, simplifying the process of international equivalence of diplomas. In this study, the goal was to develop a new competency framework based on the merging of three existing frameworks. As a Delphi study is typically used for defining competencies and for curriculum development, this method was chosen to subsequently achieve consensus on the newly merged competency framework.33
This new competency framework will be used in the SBO Scaffold project, which aims to design an evidence-based ePortfolio that supports healthcare students in their competence development at the workplace.16 Considering the international background of the frameworks used, this could serve as an example for other countries to reform their curricula or assessment tools.
Methods
Generating the competency framework
In order to construct a new competency framework, three pre-existing frameworks were used: the CanMEDS roles as defined by the Royal College of Physicians and Surgeons of Canada,12 the European Training Requirements in Obstetrics and Gynaecology as defined by the UEMS,15 and the competencies as defined by the MMSM.17
The CanMEDS competencies were selected as a foundation for the new framework because these have already been validated and implemented in the postgraduate medical training in numerous countries.11, 18-21 The integration of the UEMS framework was essential because this framework was recently developed to improve the European standards of training.22 Since the competencies as defined by MMSM are already used by several Belgian universities, these could not be left out.
First, all the CanMEDS competencies were listed out in a Microsoft Excel file by the main investigator (EA). Subsequently, the UEMS competencies were linked to the matching CanMEDS roles and consecutively to the matching key competencies and enabling competencies. During the process, this step was reviewed by a specialised research group associated with the SBO Scaffold project. All steps were repeated for the competencies of the MMSM framework.
After linking all UEMS and MMSM competencies to the CanMEDS enabling competencies, the entire list was checked for gaps and overlap. Where possible, overlapping competencies were merged based on contained keywords. If combining competencies was not possible without changing the essence, these UEMS or MMSM competencies were added to the list of CanMEDS competencies. This step was again reviewed by the research group. In total 33 UEMS competencies and 33 MMSM competencies were merged with the 89 CanMEDS enabling competencies to create a new framework of 91 competencies. A flowchart of these steps can be found in Figure 1.
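The mapping and merging described above were performed manually in Microsoft Excel and reviewed by the research group. Purely as an illustration of the keyword-screening idea, the sketch below matches a UEMS- or MMSM-style statement to the CanMEDS enabling competency with which it shares the most content words; the competency texts and the simple word-overlap heuristic are hypothetical and do not reproduce the procedure actually used.

```python
# Illustrative sketch only: a naive keyword-overlap match between competency
# statements. The study's mapping was done manually and reviewed by the
# research group; this is not the actual procedure.

def keywords(text: str) -> set[str]:
    """Lowercased content words of a competency statement (very rough)."""
    stopwords = {"the", "a", "an", "and", "of", "to", "in", "for", "with"}
    return {word.strip(".,") for word in text.lower().split() if word not in stopwords}

def best_match(candidate: str, canmeds: list[str]) -> tuple[str, int]:
    """Return the CanMEDS enabling competency sharing the most keywords with the candidate."""
    overlaps = [(c, len(keywords(candidate) & keywords(c))) for c in canmeds]
    return max(overlaps, key=lambda pair: pair[1])

# Hypothetical example statements
canmeds_enabling = [
    "Communicate effectively with patients and their families",
    "Perform a patient-centred clinical assessment and establish a management plan",
]
uems_item = "Establish a management plan for common gynaecological conditions"
print(best_match(uems_item, canmeds_enabling))  # matches the second statement (3 shared words)
```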
Study design
A Delphi methodology was used to reach consensus on the developed competency framework through an online questionnaire. Classically, a Delphi procedure consists of several rounds, during which expert opinions are assimilated using controlled feedback, ultimately leading to a group consensus.11, 24-28 It has a quasi-anonymous design: the identity of the respondents is known to the main investigator, but their answers and opinions remain strictly anonymous to the other respondents.26, 29
For this study, an e-Delphi was used, which is a web-based technique using online questionnaires. An online format facilitates participation from diverse geographical locations, is cost-effective and allows respondents to fill in the questionnaire at a moment that suits their personal agenda.25, 30
Questionnaire design
During the first Delphi round, participants were asked to assess all competencies for relevance using a 6-point Likert scale (1 = not at all relevant to 6 = very relevant).28 In addition to the Likert scale, participants were able to add comments or clarify their chosen answer during every round. The second round contained the list of all competencies, complemented by the level of consensus reached for each competency and the qualitative comments collected in the first round.11, 29 During this round, participants were asked to review the formulation and content and indicate whether or not they agreed with leaving the competency as it currently stood. All competencies without consensus regarding formulation or content were retained and, if necessary, adjusted based on the feedback of the experts. This process was done in consultation with the research group and two appointed gynaecologists (SW, TH) involved in training residents in obstetrics and gynaecology at the Ghent University Hospital. All adjusted competencies were submitted to the participants during the third round to determine whether they agreed with the implemented adjustments, in order to reach total group consensus.27
The survey was not piloted; an indication of the time required to complete the survey, and of the reliability and feasibility of the study design, was provided by a similar study conducted by a researcher within the research group.23
Selection of experts
Participants in a Delphi study are considered to be experts in their field with knowledge of the area of research.24, 28, 29 Choosing appropriate experts with knowledge of and interest in the topic directly improves the quality of the results and the content validity of the study and reduces the risk of bias.26, 29, 31
For this study, five groups of experts were included: 1) the recognition committee for obstetrician-gynaecologists in Flanders, 2) supervisors of residents-in-training for obstetrics and gynaecology affiliated with a Belgian University, 3) experts in medical education, involved in competency-based education, 4) recently (2020) recognised obstetrician-gynaecologists, 5) a member of the EBCOG.
Ethical approval was obtained from the Ethics Committee of the Ghent University Hospital. Informed consent was obtained from all participants.
Data collection
An email request for participation was sent to all eligible experts. All respondents subsequently received a personal link by email to the online questionnaire of round 1, created with Qualtrics. The same steps were followed for rounds 2 and 3, with a period of one and a half months between each round. An overview of which personal link belonged to which participant was retained, accessible only to the main investigator. Data were collected between September 2021 and January 2022 and subsequently stored on a secured Ghent University server. All results collected in Qualtrics were transferred round by round to an Excel document in which the data analyses were performed. The survey was conducted in English so that both Flemish- and French-speaking experts could participate and to decrease the risk of translation bias, since the CanMEDS framework is in English. In addition, an English framework facilitates further investigation in an international context. To enhance response rates, participants who had not (fully) filled in the questionnaire were individually sent one or two reminder emails.25, 26, 29
Data analysis
After collection, all analyses were performed in Microsoft Excel. For the first round, percentages of relevance were calculated. Using the 6-point Likert scale, competencies with a score from 4 to 6 were rated as relevant. Consensus was reached when at least 70% of the participants gave a score of 4 or higher. This threshold of 70% has been repeatedly used in previous Delphi studies.26, 30, 32, 33 During the second round, competencies were included when at least 70% of experts agreed not to change the competency. All competencies that did not reach consensus were collected and analysed based on the qualitative comments received. Qualitative comments were analysed and subsequently categorised into themes through inductive content analysis.34 Based on the analysis of the qualitative comments, the third round of the questionnaire was constructed.25, 28 Competencies with a 70% consensus to accept the adjustments in round three were included in the final competency framework.
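As a worked illustration of the consensus rule described above, the minimal Python sketch below computes the share of experts rating a competency 4 or higher on the 6-point Likert scale and flags the 70% threshold. This is only an illustrative reconstruction: the ratings are hypothetical and the study's actual analyses were performed in Microsoft Excel.

```python
# Illustrative sketch of the round-1 consensus rule (hypothetical ratings;
# the study's actual analysis was performed in Microsoft Excel).

CONSENSUS_THRESHOLD = 0.70  # at least 70% of experts must rate the competency as relevant
RELEVANT_CUTOFF = 4         # scores 4-6 on the 6-point Likert scale count as relevant

def consensus_reached(scores: list[int]) -> bool:
    """Return True when the share of ratings >= 4 meets the 70% threshold."""
    relevant = sum(1 for score in scores if score >= RELEVANT_CUTOFF)
    return relevant / len(scores) >= CONSENSUS_THRESHOLD

# Example: 17 experts rate one competency on the 6-point scale
ratings = [6, 5, 4, 6, 5, 5, 4, 3, 6, 5, 4, 6, 2, 5, 6, 4, 5]
print(consensus_reached(ratings))  # True: 15 of 17 ratings (88%) are 4 or higher
```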
Results
Demographics
A group of 122 experts was approached, of which 25 responded. In the first round, 17 of the 25 participants completed the questionnaire and 2 started but did not finish (73%). For the second round, 19 requests were sent and 16 questionnaires were returned, of which 1 was incomplete (84%). A total of 15 of the 16 experts (93%) who participated in round two completed the third round. The demographics of all participants who completed at least one round (n = 18) are listed in Table 1.
Study progress
An overview of the flow of the Delphi study can be found in Figure 2.
Round 1
The new framework, consisting of 91 competencies, was sent to the participants to assess for relevance. A consensus of 100% was reached for 40 of the 91 competencies and a consensus of >70% for a further 50 competencies; only one competency did not reach the 70% consensus threshold.
A total of 38 qualitative comments were given; these could be clustered into 6 categories: general remarks on the study (n=4), comments on a clerical error (n=6), suggestions to adjust the phrasing (n=11), questions about practical feasibility (n=5), suggestions to merge or split competencies (n=10) and additional information about the respondent's own scoring (n=2).
Round 2
During round 2, all 91 competencies were supplemented with their scores from round 1 and their specific comments. Participants were asked to judge if a competency should be adjusted or not, based on the provided feedback.
A consensus of 100% was reached for 29 competencies to retain the original formulation. No consensus was reached for 14 competencies, meaning there was a request for adjustment.
The remaining 47 competencies received at least 70% consensus. The competency that did not reach consensus on relevance during the first round also did not reach consensus to be excluded from the competency framework during the second round. Therefore, this competency was added to the list of competencies that needed adjustment. In total, 15 competencies were selected to be adjusted.
There were 51 qualitative comments in the second round. All comments were analysed within the research group, but only those referring to the competencies without consensus (n=15) were taken into account. The remaining comments (n=36) were judged not important enough to warrant changing a competency that had already reached consensus.
The 15 relevant comments could be divided into 4 categories: suggestions on how to adjust the competency (n=8), doubt about relevance for the trainee (despite being scored as relevant during the first round) (n=2), ideas on how to apply the competency in practice (n=2) and additional information about the respondent's own scoring (n=3). Based on these comments and in agreement with the research group, 10 competencies were adjusted and 5 competencies were left unchanged. All 15 competencies without consensus can be found in Table 2, together with the adjustments (where carried out) and the reasoning behind implementation or non-implementation of the adjustments.
Round 3
During the third and last round, all 15 competencies reached consensus, of which 7 reached 100% consensus. One adjusted competency, namely "Promote a culture that recognises, supports, and responds effectively to colleagues in need by facilitating the process to help", received 4 comments stating that the implemented adjustment was unnecessary. Nevertheless, a consensus of 73.3% was reached, so the adjustment was retained. The final version of the competency framework is listed in Table 3.
Discussion
A new competency framework for postgraduate training in obstetrics and gynaecology was compiled by merging three pre-existing frameworks, and consensus was reached through a Delphi study including Belgian experts.
When using a Delphi study for qualitative research, it is important that a correct study design is used to increase content validity. In this Delphi study, only 15 participants completed the entire study, although the researchers aimed to include 30 participants. Since a Delphi study is very time consuming, it is possible that only experts with a special interest in the topic, or experts with more free time than average, participated, increasing the risk of bias. However, all five expert groups listed prior to the onset of the study were represented in the study population, which reflects a representative, high-quality group.31 A response rate of at least 70% was reached in every round, and the level of consensus was defined prior to the study onset. Based on the research design that was used, we can state that the content validity of this Delphi study is adequate.
With this study, we aimed to compile a more integrated competency framework that could form the basis of a standardised national training program for postgraduate training in obstetrics and gynaecology. By integrating both general and discipline-specific competencies within one competency framework, trainees can take more control over their training and evaluation moments can be better substantiated. Thanks to the predetermined competencies, it is clearer to trainees and supervisors what is expected of them.1
A similar study was conducted within the paediatric discipline, where consensus was reached on a very similar competency framework.23 This confirms that a competency framework, as compiled in this study, is also supported by experts in other disciplines. Moreover, it confirms that such a holistic competency framework could potentially serve as a basis for curriculum reform of all specialist training programs in Belgium.
Since two internationally validated frameworks were used to create the new framework, namely the CanMEDS framework and the UEMS framework, this framework should be considered relevant in a more international context. Once it has been implemented in a new workplace-based learning curriculum in Belgium, it could form the basis of curriculum and assessment reform in other countries. The use of a postgraduate training curriculum that is based on the same competency framework and transcends national boundaries can enhance quality and transparency in medical education.
One of the limitations of this framework is that, although only one competency did not reach consensus for relevance, multiple comments indicated that the competencies are too broad or difficult to assess in clinical practice. This weakness of the CanMEDS framework has already been acknowledged in earlier literature.11, 19 Moreover, the lack of appropriate assessment tools seems to be inherent to CBME.2, 8, 19, 35 It is important to limit the list of concrete learning outcomes when developing an assessment method. Otherwise, it may lead to 'checkbox education', where the holistic view of the original competencies is lost in an endless list of abilities.3, 8 Therefore, further investigation is needed to create an appropriate assessment tool to implement these competencies in clinical practice. One possibility would be the use of entrustable professional activities (EPAs) as an assessment method.36 EPAs are descriptors of work of which both the process and the outcome can be assessed; they can be used to translate enabling competencies into clinical practice. The competencies are achieved gradually across five levels of proficiency; in this manner, all trainees follow their own individualised learning curves and potential areas of concern can be addressed earlier.
The lists of specific gynaecological and obstetrical skills and knowledge appended to the holistic competency framework were included as an example during this Delphi study but have not yet been updated or validated. A subsequent investigation to optimise and validate these lists is required to create a complete picture of postgraduate training in obstetrics and gynaecology.
Conclusions
This article provides a new integrated competency framework for postgraduate training in obstetrics and gynaecology, based on three pre-existing frameworks. Consensus on this framework was reached through a Delphi study. Such a holistic competency framework can form the basis of a curriculum reform in the postgraduate training in obstetrics and gynaecology within Belgium, but also in a more international context. Since a similar competency framework for paediatric postgraduate training also reached consensus in an earlier study, this competency framework could potentially serve as the basis for a curriculum reform of all specialist training programs in Belgium.
The competency framework in its current form is too extensive to use during workplace learning. The next steps should focus on optimisation and validation of the specific gynaecological and obstetrical skills and knowledge, and on creating a high-quality assessment tool to integrate this competency framework into practice.
Acknowledgements
The authors would like to acknowledge the contribution of Dr. Mieke Embo, Ms. Vasiliki Andreou, Ms. Oona Janssens and Ms. Sofie Van Ostaeyen for their expert opinions during the development of the competency framework. We would like to thank our participants for the investment of their time in this study.
Conflicts of Interest
The authors declare they have no conflicts of interest.
References
1. Harden RM, Laidlaw JM. Essential skills for a medical teacher: an introduction to teaching and learning in medicine. 3rd ed. Poland: Elyse O'Grady; 2021. Chapter 2, What is outcome- or competency-based education? [Cited 30 July 2024]; Available from: https://www.google.be/books/edition/Essential_Skills_for_a_Medical_Teacher/qPjqDwAAQBAJ?hl=nl&gbpv=1.
2. Carraccio C, Wolfsthal SD, Englander R, Ferentz K and Martin C. Shifting paradigms: from Flexner to competencies. Acad Med. 2002; 77: 361-367.
3. Gruppen LD, Mangrulkar RS and Kolars JC. The promise of competency-based education in the health professions for improving global health. Hum Resour Health. 2012; 10: 43.
4. van der Vleuten CPM. Competency-based education is beneficial for professional development. Perspect Med Educ. 2015; 4: 323-325.
5. Hawkins RE, Welcher CM, Holmboe ES, Kirk LM, Norcini JJ, Simons KB and Skochelak SE. Implementation of competency-based medical education: are we addressing the concerns and challenges? Med Educ. 2015; 49: 1086-1102.
6. Ten Cate O. Competency-based postgraduate medical education: past, present and future. GMS J Med Educ. 2017; 34: 69.
7. Dagnone JD, Chan MK, Meschino D, Bandiera G, den Rooyen C, Matlow A, McEwen L, Scheele F and St Croix R. Living in a world of change: bridging the gap from competency-based medical education theory to practice in Canada. Acad Med. 2020; 95: 1643-1646.
8. Garofalo M and Aggarwal R. Competency-based medical education and assessment of training: review of selected national obstetrics and gynaecology curricula. J Obstet Gynaecol Can. 2017; 39: 534-544.
9. Nguyen VT and Losee JE. Time- versus competency-based residency training. Plast Reconstr Surg. 2016; 138: 527-531.
10. Swing S. ACGME launches outcomes assessment project. JAMA. 1998; 279: 1492.
11. Michels NR, Denekens J, Driessen EW, Van Gaal LF, Bossaert LL and De Winter BY. A Delphi study to construct a CanMEDS competence based inventory applicable for workplace assessment. BMC Med Educ. 2012; 12: 86.
12. The Royal College of Physicians and Surgeons of Canada. Objectives of training in the specialty of obstetrics and gynecology, 2016 [Cited 27 October 2024]; Available from: https://www.royalcollege.ca/content/dam/documents/ibd/obstetrics-and-gynecology/obgyn-otr-2016-e.pdf.
13. Nederlandse Vereniging voor Obstetrie en Gynaecologie (NVOG). Landelijk Opleidingsplan Gynaecologie en Obstetrie (LOGO), 2021 [Cited 26 October 2024]; Available from: https://nvog-logo.nl/wp-content/uploads/2024/01/LOGO_fullEN_version921_1223.pdf.
14. Eggermont J, Van Raemdonck D and Goffin J. The medical curriculum at KU Leuven-University of Leuven. Clinical Medical Education. 2014; 1(1): 19-27.
15. European Union of Medical Specialists (UEMS). European Training Requirements in Obstetrics and Gynaecology, 2018 [Cited 26 October 2024]; Available from: https://drive.google.com/file/d/1gII5RSrrbr0MPVnhVswdLuqGPj1Drhk4/view?usp=drive_link.
16. SBO Scaffold. ePortfolios to support workplace learning in healthcare education. [Cited 26 October 2024]; Available from: https://www.sbo-scaffold.com/en.
17. Master Specialistische Geneeskunde (MSG). Leerresultatenkaart MSG, 2014 [Cited 26 October 2024]; Available from: https://drive.google.com/file/d/0BzVOpAairWg_Y2pxYlZBSFUxNjg/view?usp=sharing&resourcekey=0-Btu_V2ksmfmQnFc_JD63VQ.
18. Bharathan R, Ghai V and Ind T. Obstetrics and gynaecology trainees' perceptions of the CanMEDS expertise model: implications for training from a regional questionnaire study in the United Kingdom. J Obstet Gynaecol. 2020; 40: 1138-1144.
19. Chou S, Cole G, McLaughlin K and Lockyer J. CanMEDS evaluation in Canadian postgraduate training programmes: tools used and programme director satisfaction. Med Educ. 2008; 42: 879-886.
20. Jilg S, Möltner A, Berberat P, Fischer MR and Breckwoldt J. How do supervising clinicians of a university hospital and associated teaching hospitals rate the relevance of the key competencies within the CanMEDS roles framework in respect to teaching in clinical clerkships? GMS Z Med Ausbild. 2015; 32: 33.
21. Ringsted C, Hansen TL, Davis D and Scherpbier A. Are some of the challenging aspects of the CanMEDS roles valid outside Canada? Med Educ. 2006; 40: 807-815.
22. Van der Aa JE, Goverde AJ and Scheele F. Improving the training of the future gynaecologist: development of a European curriculum in Obstetrics and Gynaecology (EBCOG-PACT). Facts Views Vis Obgyn. 2018; 10: 1-2.
23. Robbrecht M, Norga K, Van Winckel M, Valcke M and Embo M. Development of an integrated competency framework for postgraduate paediatric training: a Delphi study. Eur J Pediatr. 2022; 181: 637-646.
24. Day J and Bobeva M. A generic toolkit for the successful management of Delphi studies. Electronic Journal of Business Research Methods. 2005; 3(2): 103-116.
25. Gill FJ, Leslie GD, Grech C and Latour JM. Using a web-based survey tool to undertake a Delphi study: application for nurse education research. Nurse Educ Today. 2013; 33: 1322-1328.
26. Hasson F, Keeney S and McKenna H. Research guidelines for the Delphi survey technique. J Adv Nurs. 2000; 32: 1008-1015.
27. Skulmoski GJ, Hartman FT and Krahn J. The Delphi method for graduate research. Journal of Information Technology Education. 2007; 6(1): 1-21.
28. Thangaratinam S and Redman CWE. The Delphi technique. The Obstetrician and Gynaecologist. 2005; 7: 120-125.
29. Keeney S, Hasson F and McKenna H. Consulting the oracle: ten lessons from using the Delphi technique in nursing research. J Adv Nurs. 2006; 53: 205-212.
30. Cowman S, Gethin G, Clarke E, Moore Z, Craig G, Jordan-O'Brien J, McLain N and Strapp H. An international eDelphi study identifying the research and education priorities in wound management and tissue repair. J Clin Nurs. 2012; 21: 344-353.
31. Hsu C and Sandford BA. The Delphi technique: making sense of consensus. Practical Assessment, Research, and Evaluation. 2007; 12(1): 10.
32. Bobonich M and Cooper KD. A core curriculum for dermatology nurse practitioners: using Delphi technique. J Dermatol Nurses Assoc. 2012; 4: 108-120.
33. Foth T, Efstathiou N, Vanderspank-Wright B, Ufholz LA, Dütthorn N, Zimansky M and Humphrey-Murto S. The use of Delphi and Nominal Group Technique in nursing education: a review. Int J Nurs Stud. 2016; 60: 112-120.
34. Elo S and Kyngäs H. The qualitative content analysis process. J Adv Nurs. 2008; 62: 107-115.
35. Swing SR. The ACGME outcome project: retrospective and prospective. Med Teach. 2007; 29: 648-654.
36. Shah N, Desai C, Jorwekar G, Badyal D and Singh T. Competency-based medical education: an overview and application in pharmacology. Indian J Pharmacol. 2016; 48: 5-9.
37. Yousuf MI. Using experts' opinions through Delphi technique. Practical Assessment, Research, and Evaluation. 2007; 12(1): 4.