Checklist for using the COBIT 5 for Risk method

Preparing for risk analysis with COBIT 5

Created by Marc-André Léger
For more information: marcandre@leger.ca

EDM03.1 Evaluate risk management

  • Is the risk appetite appropriate?
  • Are informational risks identified and managed?

EDM03.2 Direct risk management

  • Is an informational risk governance committee in place?
  • Has the organization implemented appropriate informational risk management best practices?

The organization has:

  • Completed an inventory of informational assets
  • Categorized the informational assets
  • Identified the interdependencies
  • Set up a project committee for the informational risk analysis
  • Produced a project plan for the informational risk analysis
  • Determined the management frameworks it wishes to implement
  • Prepared a situation assessment and an audit

Identification phase of the risk analysis

APO12.01.1.1

  • What is our data collection model?
  • What is within the scope of the risk analysis?
  • How will we categorize and analyze the risks?

APO12.01.1.2

  • How do we ensure good coverage of the different risks?
  • Is it sufficient?
  • Are the risk scenarios within the scope of the study?

APO12.01.1.3

  • Are we using a systematic approach?
  • Have we sought sources of supporting evidence?
  • What are we doing to limit bias?

APO12.01.1.4

  • What are the criteria for ensuring that the model can support measurement and evaluation?
  • Are we promoting a risk culture?

APO12.01.2.1

  • What data about the organization's operating environment could play an important role in managing informational risks?
  • Do we have enough information to make a decision?

APO12.01.2.2

  • Have we consulted all stakeholders within the organization?
  • What other parties could we consult?

APO12.01.2.3

The organization has identified:

  • its main sources of revenue,
  • external IT systems,
  • its legal and regulatory responsibilities,
  • its competitors,
  • trends in the IT industry,
  • the maturity of the core business and of IT capabilities,
  • geopolitical issues.

APO12.01.2.4 and APO12.01.3

  • Have we identified historical data on informational risks?
  • What is the industry's loss experience?
  • Do we have sources of evidence about the industry?

APO12.01.3.1

  • Do we have evidence about hazards?

APO12.01.3.2

  • Do we retain information on incidents, problems and investigations involving informational assets?

APO12.01.4.1

  • Is the evidence organized so as to highlight contributing factors?

APO12.01.4.2

  • What conditions existed, or did not exist, when the hazards occurred?
  • How did these conditions affect the frequency of the hazards and the magnitude of the losses?

APO12.01.4.3

  • What factors are common to the hazards?
  • Have we periodically performed a vulnerability analysis?

The risk analysis

APO12.02.1.1

  • What is the expected depth of the risk analysis effort?
  • Are we considering a broad range of options?
  • Is it proportionate to the level of maturity in informational risk management?

APO12.02.1.2

  • Have we identified the relevant vulnerabilities, the criticality of informational assets for the organization, and the triggers of hazards in the field?

APO12.02.1.3

  • Have we set objectives for optimizing the risk analysis effort?
  • Have we favoured a broad view based on business processes, outputs and internal structures?

APO12.02.1.4

Does the scope of the risk analysis take into account:

  • criticality for the organization,
  • the cost of the measures relative to the value of the informational assets,
  • the reduction of uncertainty,
  • overall regulatory requirements?

APO12.03.2.1

  • Which IT services are essential to the organization?
  • Which IT infrastructure is essential to the organization?

APO12.03.2.2

  • What are the dependencies between IT and business processes?
  • What are the weak links?

APO12.03.2.3

  • Is there consensus between the business units and IT managers on the critical informational assets?

Scenario creation

APO12.04.1.1

  • What processes are in place to coordinate (as needed) additional risk analysis activities?

APO12.04.1.2

  • How do we assess cost/benefit ratios?

APO12.04.1.3

  • How do we identify the negative impacts of hazards and scenarios?
  • How do we assess the positive effects of hazards and scenarios?
  • How do we take these data into account in our decision-making processes?

APO12.04.2

Provide decision-makers with the data that will allow them to understand:

  • the worst-case scenarios,
  • the most likely scenarios,
  • risks related to due diligence,
  • significant reputational risks,
  • legal or regulatory considerations.

APO12.04.2.1

  • Do decision-makers have in hand the elements that will allow them to judge the scenarios?
  • Do they understand them?

Prioritization phase

APO12.02.4.1

  • Have we conducted a peer review of the risk analysis?

APO12.02.4.2

  • Is the analysis sufficiently documented?

APO12.02.4.3

  • Are the estimates supported by evidence?
  • Have we put measures in place to control bias?

APO12.02.4.4

  • What mechanisms have been implemented to control bias?
  • Are there opportunities to manipulate the process?
  • Was the search for evidence exhaustive?

APO12.02.4.5

  • Were the risk analyst's level of experience and qualifications appropriate?

APO12.02.4.6

  • Do we have a professional opinion on the risk analysis?
  • Does the reduction of unacceptable risks reach the expected level?
  • Is the cost of the risk analysis process reasonable?

APO12.03.3.1

  • Do we have an inventory of processes, skills and knowledge?
  • Have we assessed the capability of the processes and the skills and knowledge of the individuals in the organization?

APO12.03.3.1

Were results and performance evaluated across the spectrum of informational risk:

  • total cost of ownership (TCO),
  • project costs,
  • costs of operations and of IT service delivery?

APO12.03.3.2

  • Have we determined whether the right controls are in place?

APO12.03.3.3

Have we identified where and how the variability of the results associated with a process affects:

  • internal control,
  • information quality,
  • organizational performance,
  • the ability to seize opportunities?

APO12.03.4.1

  • Are the variables and metrics identified and defined?
  • What are their interconnections with the impact categories?

APO12.03.4.2

  • Are the data adjusted to reflect the evolution of risk and emerging threats?

APO12.03.4.3

  • Is the cost of updating informational assets, based on their criticality, identified?
  • Are the links between hazards and their impact determined?

APO12.03.4.4

  • Have we catalogued and aggregated hazards by category, line of business and functional area of the organization?

APO12.03.4.5

  • Have we put in place an annual update process for the informational risk scenarios in order to respond to internal or external changes?

APO12.03.5.1

  • Does the organization use an informational risk register? (A minimal illustrative sketch follows this list.)
  • Have we mapped enterprise risks (ERM)?
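
As an illustration only, the following sketch (in Python) shows one possible minimal structure for an entry in an informational risk register; the field names, scales and example values are assumptions made for this illustration and are not prescribed by COBIT 5 for Risk.

    from dataclasses import dataclass, field
    from datetime import date

    @dataclass
    class RiskRegisterEntry:
        # Hypothetical fields; COBIT 5 for Risk does not prescribe this exact structure.
        risk_id: str        # e.g., "IR-2015-004"
        asset: str          # informational asset at risk
        scenario: str       # reference to a generic or custom risk scenario
        likelihood: int     # qualitative scale, e.g., 1 (rare) to 5 (almost certain)
        impact: int         # qualitative scale, e.g., 1 (negligible) to 5 (severe)
        owner: str          # accountable business or IT owner
        response: str = "accept"   # accept, mitigate, transfer or avoid
        last_review: date = field(default_factory=date.today)

        @property
        def score(self) -> int:
            """Simple qualitative score; real registers may use richer models."""
            return self.likelihood * self.impact

    # Example entry (illustrative values only)
    entry = RiskRegisterEntry(
        risk_id="IR-2015-004",
        asset="eHR database",
        scenario="0604 Sensitive data is lost/disclosed through logical attacks",
        likelihood=2,
        impact=5,
        owner="CISO",
    )
    print(entry.score)  # 10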

APO12.03.5.2

  • Do we enhance the risk profile with the results of the IT portion of the enterprise risk management (ERM) assessment?

APO12.03.5.3

  • Do we have processes for updating the attributes of the entries in the informational risk register?

APO12.03.6.1

  • Which metrics and KRIs can target significant hazards?

APO12.03.6.2

  • Are these indicators based on a model that integrates the variables needed to understand the organization's exposure and capabilities?

APO12.03.6.3

  • Are the KRIs understood by all stakeholders?

APO12.03.6.4

  • Do we have processes for periodically reviewing and revising the KRIs?

APO12.04.4.1

  • How are risk transfer needs, or the need for additional risk analysis, assessed?

APO12.04.4.2

  • How do corrective action plans influence the overall risk profile?

APO12.04.4.3

  • Do we identify opportunities for integration with ongoing risk management projects and activities?

Mobilization phase

APO12.04.3.1

  • How are the needs of the various stakeholders for reporting on the evolution of risk identified?
  • Do we apply principles of relevance, efficiency, frequency and accuracy to the reports?

APO12.04.3.2

Do the reports document:

  • effectiveness and performance,
  • problems and gaps,
  • the status of mitigation measures,
  • hazards and incidents and their impact on the risk profile,
  • the performance of the risk management processes?

APO12.04.3.3

  • Do the data from IT risk reports contribute to integrated enterprise risk management?

APO12.05.1.1

  • Has the organization completed an inventory of the controls in place?

APO12.05.1.2

  • Are the controls classified in relation to specific informational risks?

APO12.05.2.1

  • Does the organization monitor operational alignment with risk tolerance thresholds?

APO12.05.2.2

  • Does each line of business accept responsibility for its risk tolerance levels?
  • Do we have tools for monitoring key processes?

APO12.05.2.3

  • Do we monitor the effectiveness of controls?
  • Do we measure the variance of thresholds against objectives? (A minimal illustrative sketch follows this list.)
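
As an illustration only, the following sketch (in Python, with invented metric names and tolerance values) shows one way the variance of observed values against tolerance thresholds could be reported; it is not part of COBIT 5 for Risk.

    # Hypothetical observations and tolerance thresholds, for illustration only.
    observations = {
        "unpatched_critical_servers": 12,
        "failed_backup_jobs_per_month": 3,
        "privileged_accounts_without_review": 7,
    }
    tolerances = {
        "unpatched_critical_servers": 5,
        "failed_backup_jobs_per_month": 2,
        "privileged_accounts_without_review": 0,
    }

    def variance_report(observations: dict, tolerances: dict) -> list:
        """Return (metric, observed, tolerance, variance) for every metric over its tolerance."""
        report = []
        for metric, observed in observations.items():
            tolerance = tolerances.get(metric)
            if tolerance is not None and observed > tolerance:
                report.append((metric, observed, tolerance, observed - tolerance))
        return report

    for metric, observed, tolerance, variance in variance_report(observations, tolerances):
        print(f"{metric}: observed {observed}, tolerance {tolerance}, variance +{variance}")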

APO12.05.3.1

  • What are the responses to risk exposure?

APO12.05.3.2

  • Do we choose controls based on specific threats, risk exposure, probable losses and requirements specified in standards?

APO12.05.3.3

  • How do we track the evolution of the underlying risk profiles?

APO12.05.3.4

  • How do we communicate with the key stakeholders?

APO12.05.3.5

  • Do we run pilot projects to validate the effectiveness of risk mitigation measures at reducing risk?

APO12.05.3.6

  • How are updates to operational controls managed?
  • Do we have mechanisms for measuring control performance?
  • Have we planned corrective actions as needed?

APO12.05.3.7

  • What is planned for training staff on new procedures when they are deployed?

APO12.05.3.8

  • Is the progress of the projects in the master plan being tracked?
  • How do we ensure the effectiveness of the actions?
  • What is planned to validate the acceptance of residual risk?

APO12.05.3.9

  • Do we ensure that the owners of the processes concerned take on their risk management responsibilities?
  • How are deviations reported to senior management?

Created by Marc-André Léger in 2015, all rights reserved

Generic risk scenarios from COBIT 5

Portfolio establishment and maintenance

  • 0101 Wrong programmes are selected for implementation and are misaligned with corporate strategy and priorities.
  • 0102 There is duplication between initiatives. Aligned initiatives have streamlined interfaces.
  • 0103 A new important programme creates long-term incompatibility with the enterprise architecture.
  • 0104 Competing resources are allocated and managed inefficiently and are misaligned to business priorities.

Programme/projects life cycle management

  • 0201 Failing (due to cost, delays, scope creep, changed business priorities) projects are not terminated.
  • 0202 There is an IT project budget overrun. The IT project is completed within agreed-on budgets.
  • 0203 There is occasional late IT project delivery by an internal development department.
  • 0204 Routinely, there are important delays in IT project delivery.
  • 0205 There are excessive delays in outsourced IT development projects.
  • 0206 Programmes/projects fail due to not obtaining the active involvement throughout the programme/project life cycle of all stakeholders (including sponsor).

IT investment decision making

  • 0301 Business managers or representatives are not involved in important IT investment decision making (e.g., new applications, prioritisation, new technology opportunities).
  • 0302 The wrong software, in terms of cost, performance, features, compatibility, etc., is selected for implementation.
  • 0303 The wrong infrastructure, in terms of cost, performance, features, compatibility, etc., is selected for implementation.
  • 0304 Redundant software is purchased.

IT expertise and skills

  • 0401 There is a lack of or mismatched IT-related skills within IT, e.g., due to new technologies.
  • 0402 There is a lack of business understanding by IT staff affecting the service delivery/projects quality.
  • 0403 There are insufficient skills to cover the business requirements.
  • 0404 There is an inability to recruit IT staff. The correct amount of IT staff, with appropriate skills and competencies is attracted to support the business objectives.
  • 0405 There is a lack of due diligence in the recruitment process.
  • 0406 There is a lack of training leading to IT staff leaving.
  • 0407 There is insufficient return on investment regarding training due to early leaving of trained IT staff (e.g., MBA).
  • 0408 There is an overreliance on key IT staff. Job rotation ensures that nobody alone possesses the entire knowledge of the execution of a certain activity.
  • 0409 There is an inability to update the IT skills to the proper level through training.

Staff operations (human error and malicious intent)

  • 0501 Access rights from prior roles are abused. HR and IT administration co-ordinate on a frequent basis to ensure timely removal of access rights, avoiding the possibility of abuse.
  • 0502 IT equipment is accidentally damaged by staff.
  • 0503 There are errors by IT staff (during backup, during upgrades of systems, during maintenance of systems, etc.).
  • 0504 Information is input incorrectly by IT staff or system users.
  • 0505 The data centre is destroyed (sabotage, etc.) by staff.
  • 0506 There is a theft of a device with sensitive data by staff.
  • 0507 There is a theft of a key infrastructure component by staff.
  • 0508 Hardware components were configured erroneously.
  • 0509 Critical servers in the computer room were damaged (e.g., accident, etc.).
  • 0510 Hardware was tampered with intentionally (security devices, etc.).

Information (data breach: damage, leakage and access)

  • 0601 Hardware components are damaged, leading to (partial) destruction of data by internal staff.
  • 0602 The database is corrupted. Critical data is retained at a second location.
  • 0603 Portable media containing sensitive data (CD, USB drives, portable disks, etc.) is lost/disclosed.
  • 0604 Sensitive data is lost/disclosed through logical attacks.
  • 0605 Backup media is lost or backups are not checked for effectiveness.
  • 0606 Sensitive information is accidentally disclosed due to failure to follow information handling guidelines.
  • 0607 Data (accounting, security-related data, sales figures, etc.) are modified intentionally.
  • 0608 Sensitive information is disclosed through email or social media.
  • 0609 Sensitive information is discovered due to inefficient retaining/archiving/disposing of information.
  • 0610 IP is lost and/or competitive information is leaked due to key team members leaving the enterprise.
  • 0611 The enterprise has an overflow of data and cannot deduct the business relevant information from the data (e.g., big data problem).

Architecture (architectural vision and design) 

  • 0701  The enterprise architecture is complex and inflexible, obstructing further evolution and expansion leading to missed business opportunities.
  • 0702 The enterprise architecture is not fit for purpose and not supporting the business priorities.
  • 0703 There is a failure to adopt and exploit new infrastructure in a timely manner.
  • 0704 There is a failure to adopt and exploit new software (functionality, optimisation, etc.) in a timely manner.

Infrastructure (hardware, operating system and controlling technology) (selection, implementation, operations and decommissioning)

  • 0801 New (innovative) infrastructure is installed and as a result systems become unstable leading to operational incidents, e.g., Bring your own device (BYOD) programme.
  • 0802 The systems cannot handle transaction volumes when user volumes increase.
  • 0803 The systems cannot handle system load when new applications or initiatives are deployed.
  • 0804 Intermittently, there are failures of utilities (telecom, electricity).
  • 0805 The IT in use is obsolete and cannot satisfy new business requirements (networking, security, database, storage, etc.).
  • 0806 Hardware fails due to overheating.

Software

  • 0901 There is an inability to use the software to realise desired outcomes (e.g., failure to make required business model or organisational changes).
  • 0902 Immature software (early adopters, bugs, etc.) is implemented.
  • 0903 The wrong software (cost, performance, features, compatibility, etc.) is selected for implementation.
  • 0904 There are operational glitches when new software is made operational.
  • 0905 Users cannot use and exploit new application software.
  • 0906 Intentional modification of software leading to wrong data or fraudulent actions.
  • 0907 Unintentional modification of software leads to unexpected results.
  • 0908 Unintentional configuration and change management errors occur.
  • 0909 Regular software malfunctioning of critical application software occurs.
  • 0910 Intermittent software problems with important system software occur.
  • 0911 Application software is obsolete.
  • 0912 There is an inability to revert back to former versions in case of operational issues with the new version.

Business ownership of IT

  • 1001 Business does not assume accountability over those IT areas it should, e.g., functional requirements, development priorities, assessing opportunities through new technologies.
  • 1002 There is extensive dependency and use of end-user computing and ad hoc solutions for important information needs, leading to security deficiencies, inaccurate data or increasing costs/inefficient use of resources.
  • 1003 Cost and ineffectiveness are related to IT-related purchases outside of the procurement process.
  • 1004 Inadequate requirements lead to ineffective service level agreements (SLAs).

Supplier selection performance, contractual compliance, termination of service and transfer

  • 1101 There is a lack of supplier due diligence regarding financial viability, delivery capability and sustainability of supplier’s service.
  • 1102 Unreasonable terms of business are accepted from IT suppliers.
  • 1103 Support and services delivered by vendors are inadequate and not in line with the SLA.
  • 1104 Outsourcer performance is inadequate in a large-scale long-term outsourcing arrangement.
  • 1105 There is non-compliance with software licence agreements (use and/or distribution of unlicensed software, etc.).
  • 1106 There is an inability to transfer to alternative suppliers due to overreliance on current supplier.
  • 1107 Cloud services are purchased by the business without the consultation/involvement of IT, resulting in inability to integrate the service with in-house services.

Regulatory compliance

  • 1201 There is non-compliance with regulations, e.g., privacy, accounting, manufacturing.
  • 1202 Unawareness of potential regulatory changes has an impact on the operational IT environment.
  • 1203 The regulator prevents cross-border dataflow due to insufficient controls.

Geopolitical

  • 1301 There is no access due to disruptive incident in other premises.
  • 1302 Government interference and national business value.
  • 1303 Targeted action against the enterprise results in destruction of infrastructure.

Infrastructure theft or destruction

  • 1401 There is a theft of a device with sensitive data.
  • 1402 There is a theft of a substantial number of development servers.
  • 1403 Destruction of the data centre (sabotage, etc.) occurs.
  • 1404 There is accidental destruction of individual devices.

Malware

  • 1501 There is an intrusion of malware on critical operational servers.
  • 1502 Regularly, there is infection of laptops with malware.
  • 1503 A disgruntled employee implements a time bomb that leads to data loss.
  • 1504 Company data are stolen through unauthorised access gained by a phishing attack.

Logical attacks

  • 1601 Unauthorised users try to break into systems.
  • 1602 There is a service interruption due to denial-of-service attack.
  • 1603 The web site is defaced.
  • 1604 Industrial espionage takes place.
  • 1605 There is a virus attack.
  • 1606 Hacktivism takes place.

Industrial action

  • 1701 Facilities and building are not accessible because of a labour union strike.
  • 1702 Key staff is not available through industrial action (e.g., transportation strike).
  • 1703 A third party is not able to provide services because of strike.
  • 1704 There is no access to capital caused by a strike of the banking industry.

Environmental

  • 1801 The equipment used is not environmentally friendly (e.g., power consumption, packaging).

Acts of nature

  • 1901 There is an earthquake.
  • 1902 There is a tsunami.
  • 1903 There are major storms and tropical cyclones.
  • 1904 There is a major wildfire.
  • 1905 There is flooding.
  • 1906 The water table is rising.

Innovation

  • 2001 New and important technology trends are not identified.
  • 2002 There is a failure to adopt and exploit new software (functionality, optimisation, etc.) in a timely manner.
  • 2003 New and important software trends are not identified (consumerisation of IT).

Bibliography

ISACA (2013), COBIT 5 for Risk, available online: http://www.isaca.org/COBIT/Pages/Risk-product-page.aspx?cid=1002152&Appeal=PR

Altair 8800

The Altair in Popular Electronics

In 1975, the American manufacturer MITS (Micro Instrumentation and Telemetry Systems) launched the first microcomputer aimed at consumers, the Altair 8800, which used the Intel 8080 microprocessor. Having made the cover of the well-known magazine Popular Electronics, the Altair generated a great deal of public interest. Bill Gates and Paul Allen developed a version of the BASIC language for the Altair, which allowed them to found the software publisher Microsoft. The Altair 8800 arrived on the market at just the right time. At that time, several American universities required science and engineering students to take computer courses. This, along with the public's growing interest, meant that there were many potential customers with the financial resources and technical skills to use such a machine.

Altair 8800

Altair specifications

  • Manufacturer: MITS
  • Model: Altair 8800
  • Launch date: January 1975
  • Country of origin: United States
  • Price: $400 as a kit, $600 assembled
  • Processor: Intel 8080 and 8080A
  • Speed: 2 MHz
  • RAM: 256 bytes, expandable to 64 KB
  • ROM: optional
  • Optional storage: paper tape reader, cassette, or 5.25-inch floppy disk
  • Expansion: 4 slots on the motherboard, expandable to 16 slots in total
  • Bus: S-100
  • Video: LEDs on the front panel
  • Keyboard: switches on the front panel
  • I/O ports: serial and parallel
  • Operating systems: CP/M, MS-DOS, Altair Disk BASIC

Altair 8800 manuals

On the application of Nash’s Equilibrium to Healthcare Information Risk Management

Track: Privacy, security, confidentiality and protection of healthcare information

Marc-André Léger, DESS, MScA (MIS),
Université de Sherbrooke, Sherbrooke, Québec, Canada, marcandre@leger.ca

Through a case scenario approach, this article seeks to demonstrate the inadequacies of the Risk Assessment Methodologies in use today. In particular, Risk Assessment Methodologies used in a healthcare setting fail to adequately weigh ethical and public health considerations. Therefore, different approaches, relying on different paradigms, could be used. Two possible candidates are proposed: Prospect Theory and Nash's Equilibrium.

Keywords

Health, Information, Security, Risk, Management

1. Introduction

Healthcare professionals and organisations need information. Evidence-based medicine, healthcare system administration and medical research all rely on data produced throughout the system. Information systems are increasingly present in all aspects of clinical practice, in administrative functions and in many other areas. The emerging importance of the Electronic Health Record (EHR), as well as the increase in the use of information technology (IT) in healthcare activities, is progressively providing access to large quantities of data concerning patients, healthcare delivery and research [1] [2] [3]. Because of this reliance on information, and because information needs various technologies to carry it, information security has become an issue. To perform optimally, with regularity, over time, healthcare organizations need to identify the predictable [4]; they need to manage the risks associated with their need for information. This is what we call Healthcare Informational Risk Management, or HIRM.

There are many additional factors that justify the need for HIRM. The limited availability of financial and human resources motivates organizations to be very careful about how they allocate them. Because information technology can be expensive to acquire and maintain and requires specialized resources, HIRM can be used as part of the solution to keep these costs under control. For some organisations, laws and regulations impose specific requirements for the preservation of privacy and confidentiality. The nature of healthcare imposes requirements for availability, for example the availability of a patient's bedside chart during rounds. Integrity of information can also be a sensitive issue: between blood types A and B there is a single bit of difference, but if that bit is changed inadvertently, it can have severe consequences for a patient receiving the wrong blood. All of these reasons and others require healthcare organisations to implement a risk management program. For others, contractual requirements are a catalyst. Managing risks is paramount to accurate financial reporting and optimal decision-making [5]. For these and other motivations, there has been significant interest in the Québec healthcare system in finding a HIRM solution, expressed through Requests for Proposals (RFPs) published in the last year. As well, many regional jurisdictions in Canada, the Federal Government and Canada Health Infoway have shown interest in HIRM. In this article, we look at HIRM through an example in our specific local context; we therefore present an overview of the Québec healthcare system.

2. Methodology

This discussion paper presents a research hypothesis that has emerged from ongoing research being performed as a requirement for a doctoral degree in Clinical Sciences at the Faculty of Medicine of the University of Sherbrooke. The article uses a case scenario approach to illustrate a research problem that has evolved into a hypothesis.

3. Case scenario

3.1 Brief overview of the Québec Healthcare system

Since 1971, the Québec Ministry of Health and Social Services, known as the MSSS (Ministère de la Santé et des Services sociaux), has been the sole provider of healthcare in the Province of Québec, where approximately 25% of the population of Canada resides. In 2005 it employed 269,600 individuals in 1,786 worksites, representing 6.7% of the active population of Québec [6]. At the local level, 95 Health and Social Services Centres (CSSS) and the associated Local Services Networks (RLS) offer health and social services to a given population. In December 2004, the Act respecting local health and social services network development agencies (Bill 25) created the CSSS by merging local community health centres (CLSCs), residential and long-term care centres (CHSLDs) and general and specialized hospital centres (CHSGSs). The objectives of health and social services centres are the following:

  • To promote health and well-being
  • To bring together the services offered to the public
  • To offer more accessible, better coordinated and seamless services
  • To make it easier for people to move through the health and social services network
  • To ensure better patient management, particularly of the most vulnerable users

All CSSS and RLS are connected to a province-wide healthcare network known as the RTSS [26]. This private network implements a top-down infrastructure with a national datacenter (TCN) linking several regional datacenters (TCR). Local CSSS and healthcare establishments have local area networks that are connected to the TCR of their region. Typical information management services, such as email or word processing, are provided at the establishment level. In some cases, databases are supported by shared database management systems (s-DBMS) used by multiple establishments within a TCR. Internet access is provided through firewalls located at the TCN level.

3.2. Our scenario

To illustrate the problem we find in HIRM, we present a simple scenario. A resident of the Province of Québec, in the city of Montréal, accesses the RLS through a nearby community health centre (CLSC) for the flu. He is very worried because he watches the news and fears that he may have bird flu (H5N1). Arriving at the reception, his identity is verified as he presents his Québec Medicare card. At this point the resident is considered a patient (P) and goes to an isolated waiting room while his Health Record (HR) is retrieved and until the appropriate Healthcare Professional (HP) becomes available. In the CLSC, the patient's HR is partly on paper (covering the patient's pre-1998 visits) and partly in electronic format (eHR). Because of the RTSS, part of the eHR is retrieved from a local database and another portion from the s-DBMS located in the Montréal TCR. Once P has met with the HP, in this case a general practitioner (MD), blood tests are ordered, the eHR is amended to include the new information, and P leaves with a recommendation for rest and hydration. The HP suspects that P has a common cold and may suffer from an anxiety-related disorder. Once the laboratory results are returned, a few hours later, the patient is informed by phone by the CLSC that he has a simple cold. We will expand on this scenario throughout the remainder of this article to illustrate our hypothesis.

4. What is Risk?

The word risk finds its origins in the medieval Italian word risco, meaning sharp rock. In the 17th century, as the early insurance companies were involved in maritime shipping, the notion of risk evolved from the sharp rocks that were a source of danger for ships [11]. Since the introduction of probabilities by Pascal and the early work of Rousseau on uncertainty [11], the idea has developed that risk is something that can be studied; it is neither magical nor an act of God. In the 1920s, Knight [12] designated the uncertainty of a future condition of affairs with the term "risk". Later authors [13] suggest that the terms risk and uncertainty have become interchangeable, and one can often be found in the description of the other. For Browning [15], risk stems from uncertainty surrounding potential future states and the consequences of those states should they occur. The US Army [14] defines risk as the accepted result of an informed decision and uses the term gamble for an uninformed bet or guess on a hopeful outcome. In epidemiology [16], risk is most often used to express the probability that a particular outcome will occur following a particular exposure. In a healthcare setting, risk is composed of the following three components [17]:

  • Threat: an occurrence that represents a trigger event which may lead to an adverse (set of) consequence(s).
  • Vulnerability: a weakness in the system and/or the overall environment which could be exploited by a threat occurrence.
  • Impact: the (set of) consequence(s) to a healthcare unit which could arise if a threat occurrence exploited a vulnerability and adversely affected one or more assets comprising the system.

4.1 Formal Risk Assessment Methodologies

In an organisational setting, risk is managed through a mixture of formal and informal processes. Formal risk management processes are what we refer to as Risk Management. ISO Guide 73 (2002) defines Risk Management as the coordinated activities used by an organisation to direct and control risk. It generally includes risk assessment, risk treatment, risk acceptance and risk communication activities [7][8][9], which balance the operational and economic costs of risk mitigation measures to maximize organisational benefits by protecting the assets that support the organisation's mission [10].

Formal Risk Assessment Methodologies (FRAMs), such as CRAMM [18], determine risk as the "product" of the likelihood of a security incident affecting a particular asset and the impact cost. Similar approaches are used by methodologies such as MÉHARI or OCTAVE, which are used in Québec. IVRI™, developed by the author, also follows this approach. All of these methodologies share a similar qualitative paradigm. In all of them, the likelihood (probability) of a threat and the severity of the impacts must be determined by individuals in the organisation using Likert-like scales. This makes them subject to how individuals perceive risk and its components. According to Savage [20], the very assignment of numerical probabilities, even if subjective, implies that it represents choice under risk. These probabilities are expressions of what is ultimately belief and seem more like uncertainty, matters where, according to John Maynard Keynes [21], there is no scientific basis on which to form any calculable probability whatever. In the field, we have observed that the subjective nature of uncertainty may introduce internal validity problems with FRAMs. We have observed noticeable differences in individual determinations of likelihood or impact in vivo. We believe that this subjectivity is a potential source of error in risk assessment, since there is little evidence of internal validity controls in FRAMs, similar to what is used in research methodologies (e.g., triangulation).
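
As an illustration only, the following minimal sketch (in Python, with invented 5-point scales and ratings) shows the kind of likelihood-times-impact scoring these FRAMs rely on, and how two analysts' subjective ratings of the same threat can diverge; it is not taken from CRAMM, MÉHARI, OCTAVE or IVRI.

    def risk_score(likelihood: int, impact: int) -> int:
        """Qualitative risk as the product of 5-point Likert ratings (1 = lowest, 5 = highest)."""
        assert 1 <= likelihood <= 5 and 1 <= impact <= 5
        return likelihood * impact

    # Two analysts rating the same threat against the same asset (illustrative values).
    analyst_a = risk_score(likelihood=2, impact=4)   # 8
    analyst_b = risk_score(likelihood=4, impact=5)   # 20
    print(analyst_a, analyst_b)  # the spread illustrates the subjectivity discussed above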

4.2. Validity issues in FRAMs

In a non-exhaustive empirical analysis of risk assessment methodologies used in Québec organisations, we found little evidence of internal validity controls, or that internal validity controls had been validated by the creators of these methodologies. Using the validity criteria for clinical research presented by Whittemore [27], summarized in the table below, we believe that there is evidence that some criteria are not addressed.

Table 1: Primary and Secondary Criteria of Validity in Qualitative Research [27]

Primary criteria

  • Credibility: Do the results of the research reflect the experience of participants or the context in a believable way?
  • Authenticity: Does a representation of the emic perspective exhibit awareness to the subtle differences in the voices of all participants?
  • Criticality: Does the research process demonstrate evidence of critical appraisal?
  • Integrity: Does the research reflect recursive and repetitive checks of validity as well as a humble presentation of findings?

Secondary criteria

  • Explicitness: Have methodological decisions, interpretations, and investigator biases been addressed?
  • Vividness: Have thick and faithful descriptions been portrayed with artfulness and clarity?
  • Creativity: Have imaginative ways of organizing, presenting, and analyzing data been incorporated?
  • Thoroughness: Do the findings convincingly address the questions posed through completeness and saturation?
  • Congruence: Are the process and the findings congruent? Do all the themes fit together? Do findings fit into a context outside the study situation?
  • Sensitivity: Has the investigation been implemented in ways that are sensitive to the nature of human, cultural, and social contexts?

In particular, we believe that the criteria of Authenticity, Integrity, Vividness, Thoroughness, Congruence and Sensitivity appear problematic in relation to the FRAMs we have examined. This should be verified through empirical investigation. We therefore believe it is necessary to look at how decisions about risk are made in order to better understand this source of error.

4.3. Decisions about risk

Models of individual preferences about risk have their historical roots in the school of social philosophy known as Utilitarianism [22], proposed in the late 18th century. In Utilitarianism, the goal of all actions is to maximize general utility, with utility defined as any quantitative index of happiness satisfying certain basic properties. Utilitarian theory, neoclassical economic theory and game theory are the basic principles of rational choice theory, or RCT [23]. The fundamental core of RCT is that social interaction is basically an economic transaction that is guided in its course by the actor's rational choices among alternative outcomes. Decisions are taken only after their benefits and costs have been weighed, considering prices, probabilities and individual preferences. The unit of analysis is the individual decision made by an individual decision maker. RCT defines rational actions of rational individuals as occurring under several constraints:

  • Scarcity of resources
  • Opportunity costs
  • Institutional norms
  • Information

In an organisational setting, with the classic top-down management structure, the sum of these individual decisions, weighted according to the position of the decision taker, is what makes the organisation function. These individuals, in a social setting, while using RCT to maximise utility, are affected by the above-mentioned constraints, by other influences of a psychological and cultural nature, and by external pressures. The Risk Assessment Methodologies mentioned previously all implement processes to account for these RCT constraints in risk assessment. In short, Rational Choice Theory is the theoretical model behind these methodologies. While we believe this is due more to historical and cultural reasons than to epistemological positioning, it appears to be supported by empirical evidence.
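
As background only, the RCT decision rule described above is usually written as expected utility maximization; the formula below is the standard textbook formulation and is not drawn from the methodologies discussed in this paper.

    a^{*} = \arg\max_{a \in A} EU(a), \qquad
    EU(a) = \sum_{i} p_{i}(a)\, u\big(x_{i}(a)\big)

where A is the set of available actions, p_i(a) is the (possibly subjective) probability of outcome x_i(a), and u is the decision maker's utility function.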

5. Application of RCT to our risk scenario

In this section of the article, we present an example of HIRM using Rational Choice Theory. This expands on the scenario presented earlier. In the scenario presented before, there are several stakeholders. The most obvious ones are the patient (P) and the CLSC staff. The TCR staff that manage the s-DBMS and the MSSS staff that manage the Québec healthcare system are also stakeholders in our scenario. The informational assets that are involved are:

  • Québec Medicare card;
  • Health Record (HR) and electronic support (eHR);
  • The RTSS network;
  • A local database;
  • A s-DBMS located in the Montreal TCR;
  • Blood;
  • The laboratory results;

To illustrate the expected utility for each stakeholder category in relation to the relative value of the information assets, we have built the table presented below. For this purpose, we have made an informed guess at the relative values, using data from a previous study [24], in a scenario where the information asset was divulged or destroyed (high impact). For convenience, we present only the most relevant results in Table 2.

Table 2: Relative value of information assets by stakeholder category

  • Patient - Medicare card: Low; HR: High; RTSS: Low; local database: Low; s-DBMS: Low
  • Staff - Medicare card: Low; HR: Replacement cost; RTSS: Low; local database: Replacement cost and data recovery; s-DBMS: Low
  • TCR staff - Medicare card: Low; HR: Replacement cost; RTSS: High; local database: Low; s-DBMS: High
  • MSSS staff - Medicare card: Replacement cost and misuse cost; HR: Replacement cost and potential privacy lawsuits; RTSS: Replacement cost; local database: Low; s-DBMS: Replacement cost

Let us limit our scope to the eHR information asset. What, in our scenario, is the risk associated with the eHR? Risk, as we cited from Smith [17], consists of Threat, Vulnerability and Impact. If we consider the threat to be the divulgation or destruction of the eHR, and the vulnerability to be the use of the RTSS network to gain Internet access to the eHR, we can propose the following table.

Table 3: Impact of the divulgation of the eHR and risk by stakeholder category

  • Patient - Impact: loss of privacy, a basic human right protected in Québec; potential for anxiety and financial loss; loss of confidence in the healthcare system. Risk: potentially high.
  • Staff - Impact: may need to retype data retrieved from paper form (time) or restore backups (time) if available; possible sanctions by the employer (generally minor unless there is criminal intent). Risk: low.
  • TCR staff - Impact: may need to restore backups if available (time); sanctions by the employer if there is direct responsibility (generally minor unless there is criminal intent). Risk: low.
  • MSSS staff - Impact: $1,000 per incident in case of lawsuits for loss of privacy; loss of reputation if the situation is made public; possible sanctions from a letter of reprimand to demotion or loss of employment (generally minor unless there is criminal intent) if responsibility can be established. Risk: low.

In such a scenario, were the threat to materialise, the risk would be a function of the impact. Comparing the impact of its realisation with the non-divulgation state that existed before we added the threat, we would estimate the risk from each stakeholder's point of view as indicated in the last column of the previous table. If, as cited previously, the objective of Risk Management is to balance the operational and economic costs of risk mitigation measures to maximize benefits by protecting the eHR, then from the point of view of each stakeholder, with the exception of the patient, only low risk mitigation expenditures are justified. In the case of the MSSS, because it aggregates the combined risk of the 7 million residents of Québec, the combined risk can be perceived as more significant. Using RCT to maximise utility, we would expect the MSSS and the patient to give more value to protecting the eHR, while it would have less value for other stakeholders. In the field and in previous empirical research [24], we have noticed that TCR staff, when given a choice, allocated more resources to operating performance than to Risk Management activities. At the MSSS level, the suggested high aggregated risk may be supported by reality, as significant effort is devoted to ensuring that the link between the TCN and the Internet is secure. The patients, while concerned, have little say in HIRM.

5.1. Application of a Formal Risk Assement Methodology to our risk scenario

We performed a formal risk assessment for our scenario, making basic assumptions based on our knowledge of the Québec healthcare system from a previous study [24]. This was done using IVRI™. This methodology was chosen because it was created by the author of this article and because it is available at no cost on the Internet (www.leger.ca), so this risk assessment scenario may be duplicated. It was published in French [8] in 2003 and uses a spreadsheet to assist the risk assessment, which produces the graphic presented below. The IVRI Risk Index (IRi) was 1570, with a baseline (IB) at 727.

Figure 1: Estimated risk by threat category with IVRI™

By comparing the results obtained using the FRAM with the results of the application of RCT to our scenario, we believe that there is an appearance of congruence. A possible explanation for this is that FRAMs, such as IVRI, implement a form of RCT. This needs to be confirmed through empirical investigation. Our hypothesis is that the FRAMs available today in Québec implement a form of Rational Choice Theory.

5.2. Where is the problem?

The problem that we see, having done research in the field of healthcare informatics, has to do with ethics. In healthcare, a long ethical tradition, first expressed through the Hippocratic oath and reinforced by the experience of the Nuremberg Code, has evolved to be supported by law, charters of rights and codes of deontology. The rights to privacy and confidentiality are intimately connected with the right to respect for one's dignity, integrity and autonomy, which are constitutionally enshrined in the Canadian Charter of Rights and Freedoms and Québec's Charter of Human Rights and Freedoms [25]. They are the principal drivers of the requirement for adequate treatment of risk in healthcare organisations [26]. In assessing risk in a HIRM setting, it is necessary to value ethical considerations as well as the expected utility provided by RCT. In our previous scenario, ethical considerations, such as the loss of privacy if a single patient's information is divulged, present a relatively low risk for the MSSS but a high potential risk for the patient, as we illustrated in Table 3.

In a review of the literature, we have identified several components of risk in a healthcare setting [27][28][29][30][31][32][33][34]:

  • A requirement for formal Policies, Documentation and training;
  • Privacy;
  • Confidentiality;
  • Integrity;
  • Availability;
  • The presence of appropriate Safeguards;
  • Limiting Collection;
  • Processes to enable Challenging Compliance;
  • De-identification of data;
  • Secure transmission of data;
  • Management controls;
  • Accountability;
  • A requirement for Openness;
  • Informed consent;
  • Identifying Purposes of data collection;
  • Access to information by patients, including the right to withhold, segregate, amend and copy;
  • Limiting Use, Disclosure, and Retention;
  • Full disclosure (No secret databases shall exist);
  • Non-commercial use (no medical record shall be sold or utilized for marketing purposes without the prior informed consent of the individual).

In the context of the scenario we have presented, most of these components could be applicable. Either because of the legal obligations that affect the eHR, described in [35], or because of the ethical requirements of the Declaration of Helsinki [29], in an ideal risk assessment scenario all of these could have an influence on the potential for risk. Of all the components of risk, many cannot easily be assigned a value. How much value can be given to the quality of informed consent? Perhaps a value can be put on privacy by referring to jurisprudence, but it is likely to be always too low for the individual victim and always too high for the responsible party. If we look at informed consent, we find that it may be difficult to determine, in HIRM activities, the quality of that consent. In many of these ethical issues, our intuition suggests that there is no linearity between threat and impact; rather, the risk likely increases by increments, like a stairway with uneven steps. Because it is difficult to assign a value to these components of risk, any FRAM that uses financial or replacement costs as a principal component in the calculation of impact will necessarily undervalue all of the intangible components, components with little monetary value or that are difficult to perceive as an expected utility expressed in monetary terms. We believe there is a link between this problem and the validity issues mentioned earlier. We therefore make the hypothesis that FRAMs are, at best, of undetermined accuracy.
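
Purely as an illustrative assumption, this "stairway" intuition can be written as a step-valued impact function of the threat level t, rather than a linear one:

    I(t) =
    \begin{cases}
      i_{1}, & 0 \le t < t_{1} \\
      i_{2}, & t_{1} \le t < t_{2} \\
      i_{3}, & t \ge t_{2}
    \end{cases}
    \qquad i_{1} < i_{2} < i_{3}

where the thresholds t_1, t_2 and the impact levels i_k are the uneven "steps", which would have to be elicited separately for each ethical component.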

6. Why not look at different approaches?

Through a literature review performed for an ongoing research project, we have identified areas of the literature that were not covered in the traditional fields of healthcare or information technology. This has led us to look at different theories that can be used to understand Risk Management. Looking to the fields of econometrics and epidemiology, we have seen evidence that risk assessment models developed in other fields could be applied to HIRM. We have not found any evidence that this possibility has been evaluated in the IT security literature or in the healthcare literature. In our review of the literature on risk, we have found that different theories have been developed. We have also seen evidence that game theory is used in the field of insurance for risk calculation. We therefore propose the hypothesis that a different approach could be used to estimate risk in HIRM. Looking at the literature, we suggest two possible candidates to replace RCT as the basis for risk assessment in FRAMs.

6.1. Possible candidate 1: Prospect Theory

According to Edwards [36], prospect theory was first formulated by Kahneman and Tversky in 1979 as an alternative method of explaining choices made by individuals under conditions of risk, as a substitute for expected utility theory. Kahneman and Tversky recognized that the expected utility model did not fully describe the manner in which individuals make decisions in risky situations and that, therefore, there were instances in which a decision-maker's choice could not be predicted. For example, they point out that expected utility does not explain the manner in which framing can change the decision of the individual, nor does it explain why individuals exhibit risk-seeking behavior in some instances and risk-averse behavior in others. Kahneman and Tversky [37] demonstrate that subjects' choices of lotteries exhibit a wide range of anomalies that violate expected utility theory. Most importantly, they show that predictable and dramatic shifts in preference can be generated by changing the ways in which options are framed. Unlike traditional economic theories, which deduce implications from normative preferences, prospect theory takes an inductive and descriptive approach. Prospect theory can be viewed [36] as a parsimonious summary of most of the important risky choice anomalies. A HIRM approach based on prospect theory, in which risk assessment takes framing into account, could at least be possible. Further study is required to explore this, but it shows a possibility of viewing risk in a different light than is possible with expected utility. We therefore contend that there is at least one alternative theory that can be used to identify risk and model decisions about risk.
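
For reference, the prospect theory evaluation described by Kahneman and Tversky is commonly written with decision weights and a reference-dependent value function; the parametric form below is the standard formulation from the literature, given here only as background.

    V = \sum_{i} \pi(p_{i})\, v(x_{i}), \qquad
    v(x) =
    \begin{cases}
      x^{\alpha}, & x \ge 0 \\
      -\lambda\,(-x)^{\beta}, & x < 0
    \end{cases}

where \pi is a probability weighting function, the outcomes x_i are gains and losses measured from a reference point, and \lambda > 1 captures loss aversion.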

6.2. Possible candidate 2: Nash’s Equilibrium

John Forbes Nash, made famous by Russell Crowe in the movie A Beautiful Mind, suggested that expected utility may not be the best way to describe how individuals make risky decisions. 'Adam Smith needs revision', as is quoted in the movie. Nash [38] formally defined an equilibrium of a noncooperative game to be a profile of strategies, one for each player in the game, such that each player's strategy maximizes his expected utility payoff against the given strategies of the other players. If the behavior of all the players in such a game can be predicted, then the prediction must be a Nash equilibrium, or else it would violate this assumption of intelligent rational individual behavior. Should the predicted behavior not satisfy the conditions for Nash equilibrium, then there must be at least one individual whose expected welfare could be improved by educating him to more effectively pursue his own best interests, without any other change [38]. It would appear that a Nash equilibrium could better account for non-monetary impacts of risk in healthcare information systems, such as the divulgation of private information, loss of life or other ethical violations, because it approaches the problem as one of strategy rather than as one of uncertainty. Rather than viewing risk as a function of threats, probability of realisation and impact, it could view risk as a variance from a dominant strategy in a non-cooperative game between an organisation (with information to protect), an environment (which may cause damage to the information systems) and a third player (competitors, hackers or malicious employees). Approaches based on Nash's equilibrium have been used in econometrics to create mathematical models of economies and in other fields to develop models for assessing insurance risk. We believe that empirical research should be done to explore this possibility in HIRM. Such an approach could integrate the various monetary, non-monetary and ethical concerns of healthcare organisations.
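
Formally, following the definition cited from Myerson [38], a strategy profile is a Nash equilibrium when no player can increase his expected payoff by unilaterally changing his own strategy:

    u_{i}(s_{i}^{*}, s_{-i}^{*}) \;\ge\; u_{i}(s_{i}, s_{-i}^{*})
    \quad \text{for every player } i \text{ and every alternative strategy } s_{i}

where s_{-i}^{*} denotes the equilibrium strategies of the other players (here, for example, the organisation, its environment and an attacker).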

7. Conclusion

We are suggesting that different approaches to the HIRM problem should be considered. We believe that, using a case scenario approach as we did in this article, we are able to demonstrate the likelihood that the current approach is limited. It is our contention that Nash's equilibrium could better account for non-monetary impacts of risk in healthcare information systems, such as the divulgation of private information, loss of life or other ethical violations. While we believe that empirical research should be done to explore this possibility, it is currently impossible to pursue this hypothesis due, principally, to a lack of funding. This paper intends to demonstrate, through a review of the literature, examples from different fields and discussion, that this idea has merit. Should this be proved, it could eventually significantly affect how IT Risk Management is done. Unfortunately, at this time there is no research funding available to pursue this idea, or ideas of this type, in healthcare informatics research or in the information technology field. Because there are no economic incentives to do the kind of fundamental research needed to develop such an idea, private enterprise cannot pursue it. We believe that this situation should be corrected.

References

[1] Anderson JG. Security of the distributed electronic patient record: a case-based approach to identifying policy issues, International Journal of Medical Informatics, 2002, pages 111–118

[2]  Safran, C., Goldberg, H., Electronic patient records and the impact of the Internet, International Journal of Medical Informatics, 2000, pages 77–83

[3]  Sujansky, W., Heterogeneous Database Integration in Biomedicine, Journal of Biomedical Informatics, 2001, pages 285–298

[4]  Watkins, M.D., Bazerman, M.H., Predictable Surprises: The Disasters You Should Have Seen Coming, Harvard Business review Online, 2003

[5]  Stoneburner, G., Goguen, A., Feringa, A., NIST Special Publication 800-30 Risk Management Guide for Information Technology Systems, Recommendations of the National Institute of Standards and Technology, July 2002

[6] Ministère de la Santé et des Services sociaux du Québec: www.msss.gouv.qc.ca

[7] Hancock, Bill, COMMON SENSE GUIDE FOR SENIOR MANAGERS, Top Ten Recommended Information Security Practices, 1st Edition, July 2002

[8] Léger, Marc-André, Méthodologie IVRI de gestion du risque en matière de sécurité de l’information, Éditions Fortier Communications, Montréal, Septembre 2003

[9] Schumacher, H. J., Ghosh, S., A fundamental framework for network security, Journal of Network and Computer Applications, 1997, pages 305–322

[10] Myerson, Judith, Risk Management, INTERNATIONAL JOURNAL OF NETWORK MANAGEMENT, 1999, pages 305-308

[11] Beucher, S., Reghezza, M., (2004) Les risques (CAPES Agrégation), Bréal

[12] Knight, Frank H. (1921) Risk, Uncertainty, and Profit, Boston, MA: Hart, Schaffner & Marx; Houghton Mifflin Company, 1921. [Online] available from http://www.econlib.org/library/Knight/knRUP1.html ; accessed 11 December 2005

[13] Beck, Ulrich (1986) Risk Society: Towards a New Modernity, Sage Publications

[14] US Army (1998) US Army training manual FM100-14

[15] Browning, T. R. (1999) Sources of Schedule Risk in Complex System Development, Lean Aerospace Initiative at Massachusetts Institute of Technology, John Wiley & Sons

[16] Last JM, (2001) A dictionary of epidemiology. 4th edition. New York: Oxford University Press

[17] Smith, E., Eloff, J.H.P. (1999) Security in health-care information systems—current trends, International Journal of Medical Informatics, vol. 54, pp.39–54

[18] Jøsang, A., Bradley, D., Knapskog, S. J. (2004) Belief-Based Risk Analysis, Australasian Information Security Workshop 2004 (AISW 2004), Dunedin, New Zealand. Conferences in Research and Practice in Information Technology, Vol. 32

[19] Cusson, R. (2002), Étude comparative des méthodologies d’analyse de risque, Conseil du Trésor du Québec

[20] Savage, L. (1954). The Foundations of Statistics. Dover, New York.

[21] Keynes, J.M. (1937) The General Theory of Employment, QJE

[22] Lo, A. W. (1999) The Three P’s of Total Risk Management, Financial Analysts Journal, January/February 1999, pp.13-26

[23] Levi, M. et al. (1990), The Limits of Rationality, University of Chicago Press, Chicago, Illinois, in Zey, M. (1998) Rational Choice Theory and Organizational Theory: A Critique, Sage Publishing, February 1998

[24] Léger, Marc-André, Un processus d’analyse des vulnérabilités technologiques comme mesure de protection contre les cyber-attaques, synthesis report, Master’s in Management Information Systems, UQAM, June 2003, 110 pages

[25] CIHR (Canadian Institutes of Health Research), Secondary use of personal information in health research: Case studies, Canadian Institutes of Health Research, November 2002

[26] MSSS, Ministère de la Santé du Québec, Le réseau RTSS, c’est…, MSSS web site, http://www.msss.gouv.qc.ca/rtss/, 2003

[27] Whittemore, R., Validity in qualitative research, Qualitative Health Research, vol 11, no 4, july 2001, pages 522-537.

[28] Belmont Report, Ethical Principles and Guidelines for the Protection of Human Subjects of Research, The National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research, April 18, 1979

[29] WORLD MEDICAL ASSOCIATION, DECLARATION OF HELSINKI, Ethical Principles for Medical Research Involving Human Subjects, Helsinki, Finland, June 1964

[30] Nuremberg code, Directives for Human Experimentation, 1947

[31] Harkness, J., Lederer, S.E., Wikler, D., Laying ethical foundations for clinical research, Bulletin of the World Health Organization, 2001

[32] CSA (Canadian Standards Association), Model Code for the Protection of Personal Information (Q830-96), 2003, http://www.csa.ca/standards/privacy/code/Default.asp?language=english

[33] CIHR (Canadian  Institutes of Health Research), Guidelines for Protecting Privacy and Confidentiality in the Design, Conduct and Evaluation of Health Research: BEST PRACTICES, CONSULTATION DRAFT, April 2004

[34] Buckovich, Suzy A. et al., Driving Toward Guiding Principles: A Goal for Privacy, Confidentiality, and Security of Health Information, Journal of the American Medical Informatics Association, Volume 6, Number 2, Mar/Apr 1999, pages 122-133

[35] Boudreau, Christian and the CAI, Étude sur l’inforoute de la santé au Québec : Enjeux techniques, éthiques et légaux, discussion paper, October 2001

[36] Edwards, K., Prospect Theory: A Literature Review, International Review of Financial Analysis, Vol. 5, No. 1, 1996, pages 19-38

[37] Laibson D., Zeckhauser, R. (1998) Amos Tversky and the Ascent of Behavioral Economics, Journal of Risk and Uncertainty, pp 7–47

[38] Myerson, Roger B. (1996) NASH EQUILIBRIUM AND THE HISTORY OF ECONOMIC THEORY, Journal of Economic Literature 36:1067-1082 (1999), revised, March 1999, accessed online on March 8th, 2006, http://home.uchicago.edu/~rmyerson/

The author wishes to thank Professor Andrew Grant of the Faculty of Medicine of the University of Sherbrooke and the CIHR Health Informatics PhD/Postdoc Strategic Training Program (Canadian Institutes of Health Research and the BC Michael Smith Foundation for Health Research) for the funding and support that made this article possible.