
Compliance with design principles: a case study of a widely used laboratory information system


Zhila Agharezaei1, Reza Khajouei2, Leila Ahmadian3, Laleh Agharezaei4

1Department of Medical Informatics, Faculty of Medicine, Mashhad University of Medical Sciences, Mashhad, Islamic Republic of Iran. 2Health Services Management Research Center, Institute for Futures Studies in Health, Kerman University of Medical Sciences, Kerman, Islamic Republic of Iran. 3Medical Informatics Research Center, Institute for Futures Studies in Health, Kerman University of Medical Sciences, Kerman, Islamic Republic of Iran. 4Modeling in Health Research Center, Institute for Futures Studies in Health, Kerman University of Medical Sciences, Kerman, Islamic Republic of Iran. (Correspondence to: L. Ahmadian).


Background: Laboratory information systems (LISs) are widely used health information systems that have the potential to improve healthcare quality. Despite their benefits, many studies have indicated problems with user interaction with these systems due to poor interface design.

Aims: This study aimed to evaluate usability of an LIS.

Methods: In this descriptive, cross-sectional study, we used heuristic evaluation to examine the user interface design of an LIS in an academic hospital affiliated with Kerman University of Medical Sciences in 2017. This system is also used in 59 other Iranian hospitals. We investigated the usability of different parts of the LIS (outpatient admission, inpatient admission, sample collection, and test result reporting). Data were collected using a standard form based on the heuristic evaluation method, and categorized based on their severity and violated heuristics. The content validity of the form was confirmed by 3 medical informatics specialists.

Results: We identified 162 usability problems. In terms of the heuristics, the highest number of problems concerned flexibility and efficiency of use (n = 32, 19.75%) and the lowest concerned help users recognize, diagnose, and recover from errors (n = 2, 1.23%). In terms of different modules of the system, the highest number of problems (n = 51, 31.48%) concerned outpatient admission and the lowest (n = 29, 17.9%) concerned sample collection. In terms of severity, 45.06% of the problems were rated as major.

Conclusions: Despite widespread use of LISs, their user interface design has usability problems that diminish the quality of user interaction with these systems and may affect the quality of health care. Consideration of standards and principles for user interface design, such as the heuristics used in this study, could improve system usability.

Keywords: laboratory information system, usability evaluation, user interface, human–computer interaction, heuristic evaluation.

Citation: Agharezaei Z; Khajouei R; Ahmadian L; Agharezaei L. Compliance with design principles: a case study of a widely used laboratory information system. East Mediterr Health J. 2020;xx(x):xxx–xxx. https://doi.org/10.26719/emhj.20.029

Received: 22/08/18; accepted: 24/04/19

Copyright © World Health Organization (WHO) 2020. Open Access. Some rights reserved. This work is available under the CC BY-NC-SA 3.0 IGO license (https://creativecommons.org/licenses/by-nc-sa/3.0/igo)


Introduction

The health system of the Islamic Republic of Iran functions in an environment of rapidly changing social, economic and technical factors (1). The private and public sectors both provide healthcare services; however, the public sector, especially the Ministry of Health, plays the more important role (2). Comparative study of the healthcare systems of countries identified as successful by the World Health Organization (WHO), and use of their experiences, will assist the Islamic Republic of Iran to achieve a prosperous health system (1). Hospital information systems (HISs) are among the most important and widely used information systems in health care (3). HISs are used for collecting, processing and retrieving patient information from different sources and using it for clinical and management decision-making (3,4). These systems can improve the quality of care and increase patients’ safety and providers’ efficiency (4–6). Most HISs contain subsystems including inpatient, outpatient, emergency, pharmacy, accounting, radiology, laboratory and medical records (7).

Laboratory information systems (LISs) are HISs that can be used to order tests, process samples, receive results, and create and communicate reports (8). LISs can improve laboratory processes and the documentation accuracy of laboratory test results. Nevertheless, research has shown that use of LISs can involve major errors (8–10). Laboratory errors can lead to wrong diagnoses, inappropriate care, delayed treatment, poor clinical research and increased costs, and can endanger patients’ lives (9). Many of these errors are related to usability problems of LISs (11–14). Identifying and then preventing the usability problems of LISs therefore seems essential (15). The interface design of an LIS can have a dramatic effect on user interaction and satisfaction with the system (16–18). Evaluation of the system can inform user interface redesign, improve user acceptance and eliminate major problems in the system.

Heuristic evaluation (HE) is one of the usability evaluation methods. In this method, the compliance of an IS user interface design with some recognized standards is evaluated (19). Several studies have used HE in the healthcare sector, including evaluation of electronic health records (20), LISs (12,15,21,22) and radiology information systems (22).

To make working with ISs convenient, efficient and satisfactory, a series of standard principles should be followed in user interface design. The evaluation of systems according to these principles, which is called HE, was developed by Nielsen in 1990 (19). In this method a group of evaluators examine the user interface and judge its compliance against recognized principles (the heuristics) (23). According to Nielsen, 3–5 expert evaluators can identify an average of 74–87% of the problems (19,23). In Nielsen’s method, 10 main principles (heuristics) are used for evaluation of ISs.
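Nielsen’s estimate that a few evaluators suffice rests on his problem-discovery model, in which the expected share of problems found by i independent evaluators is 1 − (1 − λ)^i, where λ is the probability that a single evaluator detects a given problem. A minimal sketch of this model follows; the default λ = 0.31 is Nielsen’s published average across studies, not a value measured in this article:

```python
def proportion_found(evaluators: int, detection_rate: float = 0.31) -> float:
    """Expected share of usability problems found by a panel of evaluators,
    per Nielsen's problem-discovery model: 1 - (1 - lambda)^i.

    detection_rate (lambda) is the chance one evaluator finds a given problem;
    0.31 is Nielsen's reported average, used here as an assumption."""
    return 1 - (1 - detection_rate) ** evaluators

# Illustrate how quickly coverage saturates as evaluators are added.
for i in (1, 3, 5):
    print(f"{i} evaluator(s): {proportion_found(i):.0%}")
```

With λ = 0.31 the model gives roughly 67% coverage for 3 evaluators and 84% for 5, which is why panels larger than about 5 yield rapidly diminishing returns.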

Many studies have reported large numbers of usability problems that have a negative effect on user–system interaction (12,20,24). Several studies have evaluated the usability of LISs in developed countries (10,12,13,15,21), but usability has not been sufficiently studied in some developing countries such as the Islamic Republic of Iran (22). Considering the importance of LISs for patient safety, the objective of this study was to evaluate the user interface design of an LIS at Kerman University of Medical Sciences, as an example of the LISs used in Iranian hospitals.


Methods

In this study the user interface design of an LIS was evaluated in an academic hospital in 2017. This LIS is a subsystem of an HIS actively used in 60 hospitals (45 general and academic and 15 private) in the Islamic Republic of Iran. The LIS comprised the following 4 parts: outpatient admission, inpatient admission, sample collection and result reporting. Its features included patient records, patient information editing, patient search, test search, list of tests, sample collection editing, blood requests and delivery, quality control, and paraclinical services. The system was accessed at Bahonar Hospital in Kerman. We evaluated the user interface of the LIS in general; therefore, the results are not specific to this hospital.

Four evaluators independently evaluated the LIS using Nielsen’s principles (23). The evaluators included 2 medical informatics professionals with expertise in usability evaluation, and 2 evaluators who held master’s degrees in information technology management and information systems management, with backgrounds in software engineering. The latter 2 were trained in HE. Each evaluator examined the conformity of different parts of the LIS with the heuristics, based on the method proposed by Nielsen (23), and recorded any problems in a data collection form. The content validity of the form was confirmed by 3 medical informaticians. The form consisted of a table with columns for problem name, problem description, problem location, and violated heuristic. After identification of the problems, their severity was determined according to Nielsen (25), based on 3 factors: frequency, impact and persistence (Table 1).
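Nielsen’s severity ratings combine frequency, impact and persistence into a single score from 1 (cosmetic) to 4 (catastrophic), which is consistent with the average severities of 2.0–3.2 reported later in this article. A minimal sketch of one data-collection record and the rating-to-label mapping follows; the field names are illustrative assumptions, not the exact column names of the study’s form:

```python
from dataclasses import dataclass

# Nielsen's severity labels for ratings 1-4. Rating 0 ("not a problem")
# is omitted because only actual problems were recorded on the form.
SEVERITY_LABELS = {1: "cosmetic", 2: "minor", 3: "major", 4: "catastrophic"}

@dataclass
class UsabilityProblem:
    name: str
    description: str
    location: str            # module, e.g. "outpatient admission"
    violated_heuristic: str  # one of Nielsen's 10 heuristics
    severity: int            # 1-4, judged from frequency, impact, persistence

def label(rating: float) -> str:
    """Map a (possibly averaged) severity rating to the nearest label,
    clamped to the valid 1-4 range."""
    return SEVERITY_LABELS[min(4, max(1, round(rating)))]
```

For example, an averaged rating of 3.2 (the study’s maximum per-heuristic mean) maps to "major", and 2.0 (the minimum) maps to "minor", matching how the article labels per-heuristic averages.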

The data collected from the independent evaluations were compared and duplicates were removed by the evaluators from the identified problems. The problems were then inserted into 2 separate lists, organized by violated heuristic and by evaluated module of the system, and the number of evaluators identifying each problem was recorded alongside it. Any disagreement about the identified problems or their allocation to each heuristic was discussed and resolved in a joint meeting. Data were then analysed using SPSS version 22 (SPSS Inc., Chicago, IL, USA).
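The pooling procedure described above (merging the evaluators’ independent lists, removing duplicates, tallying problems by heuristic and by module, and noting how many evaluators found each problem) can be sketched as follows; the tuple layout and sample entries are illustrative assumptions, not data from the study:

```python
from collections import Counter

# Each evaluator's list holds (problem_name, violated_heuristic, module) tuples.
evaluator_lists = [
    [("no window title", "consistency and standards", "outpatient admission"),
     ("cannot magnify window", "flexibility and efficiency of use", "test orders")],
    [("no window title", "consistency and standards", "outpatient admission")],
]

# Merge the lists; Counter both deduplicates and records how many
# evaluators reported each problem.
reports = Counter(p for lst in evaluator_lists for p in lst)
unique_problems = list(reports)

# Tally unique problems per violated heuristic and per system module.
by_heuristic = Counter(h for _, h, _ in unique_problems)
by_module = Counter(m for _, _, m in unique_problems)

# Problems identified independently by every evaluator.
found_by_all = [p for p, n in reports.items() if n == len(evaluator_lists)]
```

Disagreements about category assignments would still need the joint-meeting step described in the text; the code only automates the mechanical merge and counts.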


Results

HE of the LIS using Nielsen’s principles identified 162 problems. There were 68 unique problems, of which 46 were repeated in more than 1 part of the system. Of the 68 unique problems, 9 (13.24%) were identified by only 1 evaluator, 42 (61.76%) by 2 evaluators and 17 (25%) by all 4 evaluators.

Most of the identified problems were related to the flexibility and efficiency of use principle (n = 32, 19.75%), followed by aesthetic and minimalist design (n = 27, 16.66%), visibility of system status (n = 27, 16.66%), consistency and standards (n = 26, 16.04%), error prevention (n = 15, 9.25%), recognition rather than recall (n = 11, 6.79%), help and documentation (n = 9, 5.55%), match between system and real world (n = 8, 4.93%), and user control and freedom (n = 5, 3.08%); the lowest number of problems was related to helping users recognize, diagnose and recover from errors (n = 2, 1.23%) (Table 2). Considering the different parts of the LIS, the greatest number of problems was related to the outpatient (n = 51, 31.48%) and inpatient (n = 47, 29.01%) admission parts, and the least to sample collection, with 29 problems (17.9%) of the total of 162.

Table 3 presents the average severity of the problems related to each heuristic and the frequency of problems based on their severity. The average severity of problems was major for 4 heuristics (flexibility and efficiency of use; match between system and real world; user control and freedom; and help and documentation) and minor for the other heuristics. The maximum average severity was 3.2, for flexibility and efficiency of use, and the minimum was 2.0, for aesthetic and minimalist design. Major severity was the most common (n = 73, 45.06%), followed by minor (n = 48, 29.62%), cosmetic (n = 22, 13.58%) and catastrophic (n = 19, 11.72%) (Table 3). Some of these problems, if they persist, can have negative effects on user performance, such as fatigue, confusion and wasted time. This can cause errors and subsequently reduce the quality of care and patient safety (Table 4).

Problems identified by Nielsen’s principles

Visibility of system status

Problems concerning nonconformity to this principle were more frequent in the Inpatient and Outpatient Admission sections. These problems included inappropriate window titles; lack of a horizontal scroll bar when presenting patients’ search information; an unspecified hierarchy under windows; and missing row numbers in task-list reports. Some of these problems, such as the first and second, were identified by all 4 evaluators.

Match between the system and real world

Problems associated with this principle confused the users and wasted their time: (1) the order of tests displayed on the computer differed from that on the printed results sheet, which could cause mistakes while entering the results and demanded a lot of time to review the input data; (2) a pop-up message was inconsistent with the actual task executed (in the delivered samples section, when the user clicked on Removal of the Delivered Sample, a deletion message popped up; however, nothing actually happened); and (3) on the test orders window, tests could be classified into only 2 categories, Emergency Room and Post-Cardiac Care Unit.

User control and freedom

Compared with other principles, fewer problems were associated with this area. Two of them were: (1) inaccessibility of the previous or next windows (only some windows had a back button; otherwise, users needed to close the window and start over); and (2) inability to remove a sample in the delivered samples section. The latter confused the users more than the former.

Consistency and standards

Violation of this principle was mostly related to the Inpatient and Outpatient Admission and Test Orders sections. Problems included inconsistent window titles (Farsi/English/no title); inconsistent application of highlighting techniques (e.g., bolding text); and different displays of test order information in the inpatient and outpatient sections. The first problem was identified by all 4 evaluators.

Error prevention

Problems arising from violation of this principle were found mostly in the Outpatient Admission and Results sections. Examples were: failure to prevent entry of wrong data (the input field for year of birth allowed 4 characters, yet only the 2 rightmost characters were recorded); no error message when numerical data were entered in letter fields; and allowing the user to disable the year of birth field and then, upon submission, displaying an error message for the missing year of birth.

Recognition rather than recall

Problems concerning this principle were mostly detected in the Admission, Test Orders and Results sections. They included: unidentifiable functions of the items in the patient’s previous orders and of the Abbreviated Name field in the test orders window; the unknown function of the According to Patient List field on the laboratory working list; separated and scattered insurance data; and a missing title for the sidebar checkbox for the list of tests on the test orders window.

Flexibility and efficiency of use

Problems arising from ignoring this principle mostly involved the Test Orders section; 50% of them were identified by all 4 evaluators. They included: inability to magnify windows; lack of settings to accommodate user preferences (colour, font, window size and customization of the program); and difficulty using the vertical scrollbar. Among the problems that made the interface challenging for users were invisible titles of patient information columns (some information remained invisible even after widening the cells) and the inability to print test fees (users had to write the test fees on the prescription manually, one by one).

Help users recognize, diagnose and recover from errors

Problems concerning this principle had the lowest frequency and were mostly centred on the Test Orders section. An example was the display of inappropriate messages in response to the user’s actions. For instance, failure to record the serial number of patients insured by the Armed Forces Insurance Organization produced the message ‘no match found in the patient search’, which did not help the user to understand and solve the problem.

Aesthetic and minimalist design

Nonobservance of this principle caused problems throughout the system. Major problems included small font sizes, overly light-coloured text, crammed information on the test orders page, and confusing page titles. On some pages, titles extended outside their boxes. A large load of useless information in advanced search results was another problem.

Help and documentation

Problems associated with violation of this principle were mostly related to the Test Orders section. Identified problems included absent or inaccessible Help throughout the system (including help buttons and descriptive, procedural, interpretational and navigational information), and untitled test lists and tables in previous orders in the Test Orders section.


Discussion

HE of the LIS showed that, despite its widespread use throughout Iranian hospitals, it had a high number of usability problems. If some of these problems were to persist, they could have negative effects on user performance, such as fatigue, confusion and wasted time. This could cause errors and reduce the quality of care and patient safety. Similar studies have shown that problems such as incomprehensibility of the system design have negative effects on the quality of patient care (20,26).

In our study, most of the identified problems were related to violation of flexibility and efficiency of use (19.75%), visibility of system status (16.66%), aesthetic and minimalist design (16.66%), and consistency and standards (16.04%). The lowest number of problems was related to helping users recognize, diagnose and recover from errors (1.23%). In some previous studies in developing and developed countries (13,20,22), most of the problems were related to consistency and standards. In contrast, in other studies (18,27), violation of flexibility and efficiency of use produced the lowest number of problems.

The results show that problems of major severity were the most numerous (45.06%) and problems of catastrophic severity the fewest (11.72%). In previous studies (18,28), the severity of problems was classified as major and catastrophic, with a greater number of catastrophic violations than in our study. In 2 other similar studies (29,30), the severity of problems was classified as major.

Our results confirm those of other studies. The surveys of Alanazi et al. (15) and Mathews et al. (21) using the System Usability Scale (SUS) showed that the usability of LISs was poor. Most evaluation methods have reviewed user satisfaction with and attitudes towards HISs (31); compliance with users’ needs (32); the impact of HISs on factors influencing the quality of clinical services, hospital performance and work processes (7); and the usability and efficiency of HISs (33). These previous evaluations used questionnaires, interviews and checklists, and did not address the usability problems that users may encounter during actual interaction with the computer.

Our results show that many of the problems with existing ISs are preventable by following standards and principles for system design. Major problems included inconsistency between system messages and actions, invisible column headings for patient information, and the inability to print the cost of laboratory tests. These problems can be solved by defining correct and clear messages that correspond to their functions, accurately defining scrollbars, correctly designing patient data entry forms, and providing printable output for the costs of laboratory tests. Consistent with these results, we observed that because of poor usability and difficulty in working with the LIS, users refused to use some sections and performed their tasks manually.

Expertise in evaluation and in the domain of the studied system results in better identification of problems (29,34). Two of our evaluators were medical informaticians with long histories of working in healthcare and skills in usability evaluation, and the other 2 evaluators were proficient in computer systems. This increased the reliability of our findings.

Nielsen (19) has shown that increasing the usability of user interfaces requires regular evaluation and updating in different stages of the system development lifecycle. The updating process must include adding new functions and troubleshooting existing problems, which increases users’ understanding of the system. This type of evaluation can be done using an inexpensive and simple method such as HE (35), for which evaluators are easy to train and which produces large amounts of information. HE can also be carried out in a short time, so the list of problems is quickly available (19,35).

This study had two limitations. First, we evaluated a widely used LIS (in 60 hospitals) at a single hospital. Since we did not check whether the other 59 hospitals use the same version of the system or an upgraded or customized version, generalization of the results to the other LISs should be done with caution. However, since most of the main functionalities and features of the systems are similar, we believe this cannot significantly affect our results. Second, HE identifies problems that mostly hinder novice users’ interaction with a system. Therefore, if users have received extensive training and supervision, specifically concerning how to work around or be careful of the user interface issues found in this study, these problems should not critically threaten the outcomes of users’ interaction, such as clinical actions. Future studies could examine this issue by evaluating the effect of user interface problems on actual users’ actions.


Conclusion

We have shown that conducting a usability evaluation can help identify the origins of problems that cause new errors, users’ fear of operating the system, and their resistance to it. These problems occur because IS designs do not comply with accepted standards and principles, which may impede user–IS interaction. A defective or unsuccessful interaction leads to an undesirable experience when operating the system and results in poorer quality of care. Our results can be used to obviate the identified problems in the redesign process and to prevent them in new ISs.


Recommendations

The following recommendations may help improve and correct ISs.

Providing printable output for test fees; documentation and accessibility of help; appropriate design of patient information forms; proper definition of search options; use of attractive techniques including bold colours, text size, fonts, etc.; hierarchical categorization and structuring to prevent information overload; use of specific and short titles for pages; definition of icons proportionate to the expected function; and preventing user errors in the system.

Conducting user needs assessments prior to IS design to ensure compliance with user needs, and conducting usability studies throughout all design and development stages.

Providing developers with design standards prior to commencing design to improve system usability.


Acknowledgements

We would like to thank all the managers and experts of the Computer and Informatics Department, laboratory staff, and staff of the Medical Documentation and Statistics Department of Bahonar Hospital.

Funding: None.

Competing interests: None declared.


References

  1. Khangah HA, Jannati A, Imani A, Salimlar SH, Derakhshani N, Raef B. Comparing the health care system of Iran with various countries. Health Scope. 2016;6(1):e34459. http://dx.doi.org/10.17795/jhealthscope-34459
  2. Mehrdad R. Health system in Iran. JMAJ. 2009;52(1):69–73. http://www.med.or.jp/english/pdf/2009_01/069_073.pdf
  3. Haux R. Health information systems - past, present, future. Int J Med Inform. 2006 Mar–Apr;75(3–4):268–81. http://dx.doi.org/10.1016/j.ijmedinf.2005.08.002 PMID:16169771
  4. Agharezaei ZH, Bahaadinbeigy K, Tofighi SH, Agharezaei L, Nemati A. Attitude of Iranian physicians and nurses toward a clinical decision support system for pulmonary embolism and deep vein thrombosis. Comput Methods Programs Biomed. 2014 Jul;115(2):95–101. http://dx.doi.org/10.1016/j.cmpb.2014.03.007 PMID:24768080
  5. Hamborg KC, Vehse B, Bludau HB. Questionnaire based usability evaluation of hospital information systems. Electron J Inform Syst Eval. 2004;7(1):21–30.
  6. Agharezaei L, Agharezaei ZH, Nemati A, Bahaadinbeigy K, Keynia F, Baneshi MR, et al. The prediction of the risk level of pulmonary embolism and deep vein thrombosis through artificial neural network. Acta Inform Med. 2016 Oct;24(5):354–9. http://dx.doi.org/10.5455/aim.2016.24.354.359 PMID:28077893
  7. Ahmadian L, Salehi Nejad S, Khajouei R. Evaluation methods used on health information systems (HISs) in Iran and the effects of HISs on Iranian healthcare: a systematic review. Int J Med Inform. 2015 Jun;84(6):444–53. http://dx.doi.org/10.1016/j.ijmedinf.2015.02.002 PMID:25746766
  8. Snydman LK, Harubin B, Kumar S, Chen J, Lopez RE, Salem DN. Voluntary electronic reporting of laboratory errors. Am J Med Qual. 2012 Mar–Apr;27(2):147–53. http://dx.doi.org/10.1177/1062860611413567 PMID:21918013
  9. Blaya JA, Shin SS, Yale G, Suarez C, Asencios L, Contreras C, et al. Electronic laboratory system reduces errors in National Tuberculosis Program: a cluster randomized controlled trial. Int J Tuberc Lung Dis. 2010 Aug;14(8):1009–15. PMID:20626946
  10. Leen T, Erdogmus D, Kazmierczak S. Statistical error detection for clinical laboratory tests. Conf Proc IEEE Eng Med Biol Soc. 2012;2012:2720–3. http://dx.doi.org/10.1109/EMBC.2012.6346526 PMID:23366487
  11. Peute LW, Jaspers MW. The significance of a usability evaluation of an emerging laboratory order entry system. Int J Med Inform. 2007 Feb–Mar;76(2-3):157–68. http://dx.doi.org/10.1016/j.ijmedinf.2006.06.003 PMID:16854617
  12. Peute LW, Jaspers MM. Usability evaluation of a laboratory order entry system: cognitive walkthrough and think aloud combined. Stud Health Technol Inform. 2005;116:599–604. PMID:16160323
  13. Yasini M, Duclos C, Lamy JB, Venot A. Facilitating access to laboratory guidelines by modeling their contents and designing a computerized user interface. Stud Health Technol Inform. 2011;169:487–91. PMID:21893797
  14. Khajouei R, Hajesmaeel Gohari S, Mirzaee M. Comparison of two heuristic evaluation methods for evaluating the usability of health information systems. J Biomed Inform. 2018 Apr;80:37–42. http://dx.doi.org/10.1016/j.jbi.2018.02.016 PMID:29499315
  15. Alanazi F. Evaluating the usability of the laboratory information system (LIS) in Coombe Hospital and Hail Hospital [thesis]. Dublin: Dublin Institute of Technology; 2015.
  16. Ebnehoseini Z, Tara M, Meraji M, Deldar K, Khoshronezhad F, Khoshronezhad S. Usability evaluation of an admission, discharge, and transfer information system: a heuristic evaluation. Open Access Maced J Med Sci. 2018 Nov 10;6(11):1941–5. http://dx.doi.org/10.3889/oamjms.2018.392 PMID:30559840
  17. Zhang J, Johnson TR, Patel VL, Paige DL, Kubose T. Using usability heuristics to evaluate patient safety of medical devices. J Biomed Inform. 2003 Feb–Apr;36(1–2):23–30. http://dx.doi.org/10.1016/s1532-0464(03)00060-1 PMID:14552844
  18. Atashi A, Khajouei R, Azizi A, Dadashi A. User interface problems of a nationwide inpatient information system: a heuristic evaluation. Appl Clin Inform. 2016 Feb 17;7(1):89–100. http://dx.doi.org/10.4338/ACI-2015-07-RA-0086 PMID:27081409
  19. Nielsen J. Technology transfer of heuristic evaluation and usability inspection. 1995 (www.useit.com/papers/heuristic/learning_inspection.html, accessed 23 January 2020).
  20. Georgsson M, Staggers N, Weir CH. A modified user-oriented heuristic evaluation of a mobile health system for diabetes self-management support. Comput Inform Nurs. 2016 Feb;34(2):77–84. http://dx.doi.org/10.1097/CIN.0000000000000209 PMID:26657618
  21. Mathews A, Marc D. Usability evaluation of laboratory information systems. J Pathol Inform. 2017 Oct 3;8:40. http://dx.doi.org/10.4103/jpi.jpi_24_17 PMID:29114434
  22. Nabovati E, Vakili-Arki H, Eslami S, Khajouei R. Usability evaluation of Laboratory and Radiology Information Systems integrated into a hospital information system. J Med Syst. 2014 Apr;38(4):35. http://dx.doi.org/10.1007/s10916-014-0035-z PMID: 4682671
  23. Nielsen J. How to conduct a heuristic evaluation. 1994 (http://www.useit.com/papers/heuristic/heuristic_evaluation.html, accessed 23 January 2020).
  24. Kushniruk AW, Triola MM, Borycki EM, Stein B, Kannry JL. Technology induced error and usability: the relationship between usability problems and prescription errors when using a handheld application. Int J Med Inform. 2005 Aug;74(7–8):519–26. http://dx.doi.org/10.1016/j.ijmedinf.2005.01.003 PMID:16043081
  25. Nielsen J. Severity ratings for usability problems. 1995 (https://www.nngroup.com/articles/how-to-rate-the-severity-of-usability-problems/, accessed 23 January 2020).
  26. Ellsworth MA, Dziadzko M, O’Horo JC, Farrell AM, Zhang J, Herasevich V. An appraisal of published usability evaluations of electronic health records via systematic review. J Am Med Inform Assoc. 2017 Jan;24(1):218–26. http://dx.doi.org/10.1093/jamia/ocw046 PMID:27107451
  27. Farzandipour M, Nabovati E, Zaeimi GH, Khajouei R. Usability evaluation of three admission and medical records subsystems integrated into nationwide hospital information systems: heuristic evaluation. Acta Inform Med. 2018 Jun;26(2):133–8. http://dx.doi.org/10.5455/aim.2018.26.133-138 PMID:30061787
  28. Choi B, Drozdetski S, Hackett M, Lu C, Rottenberg C, Yu L, et al. Usability comparison of three clinical trial management systems. AMIA Annu Symp Proc. 2005:921. PMID:16779208
  29. Joshi A, Arora M, Dai L, Price K, Vizer L, Sears A. Usability of a patient education and motivation tool using heuristic evaluation. J Med Internet Res. 2009 Nov 6;11(4):e47. PMID:19897458
  30. Pressler TR, Yen PY, Ding J, Liu J, Embi PJ, Payne PR. Computational challenges and human factors influencing the design and use of clinical research participant eligibility prescreening tools. BMC Med Inform Decis Mak. 2012 May 30;12:47. http://dx.doi.org/10.1186/1472-6947-12-47 PMID:22646313
  31. Azizi A, Safari Sh, Mohammadi A, Kheirollahi J, Shojaei Baghini M. A survey on the satisfaction rate of users about the quality of hospital information system in hospitals associated with Kermanshah University of Medical Sciences. Health Inform Manage. 2011 Oct–Nov;8(4):571–80.
  32. Ahmadi M, Hosseini F, Barabadi M. A survey on the compatibility of the Hospital Information Systems (HIS) with the needs of medical records users from the system. J Health Admin. 2006;11(32):25–32 (in Persian).
  33. Ahmadi M, Shahmoradi L, Barabadi M, Hoseini F. Usability evaluation of hospital information systems based on IsoMetric 9241. Hakim Res J. 2011;13(4):226–33 (in Persian).
  34. Horsky J, McColgan K, Pang JE, Melnikas AJ, Linder JA, Schnipper JL, et al. Complementary methods of system usability evaluation: surveys and observations during software design and development cycles. J Biomed Inform. 2010;43(5):782–90. http://dx.doi.org/10.1016/j.jbi.2010.05.010
  35. Nielsen J, Phillips VL. Estimating the relative usability of two interfaces: heuristic, formal and empirical methods compared. In: Proceedings of the INTERACT ’93 and CHI ’93 Conference on Human Factors in Computing Systems. May 1993:214–21 (https://doi.org/10.1145/169059.169173, accessed 23 January 2020).