Archived Content

Information identified as archived is provided for reference, research or recordkeeping purposes. It is not subject to the Government of Canada Web Standards and has not been altered or updated since it was archived.



Report Cards Presentation
ATIP Community Meeting
Josée Villeneuve, Director of Strategic Planning, Parliamentary Relations and Communications
Ottawa, Ontario
February 19, 2008


  • The Office of the Information Commissioner proactively reviews and grades the performance of government institutions in complying with the Access to Information Act. These assessments are called “report cards”.
  • Before engaging in this year’s report cards process, we assessed whether improvements could be made to the process.
  • Report Cards have had many benefits:
    • Initial dramatic reduction in the number of delay complaints.
    • Many access offices received additional funds, allowing them to be better resourced and staffed to deal with access requests within the statutory time-limits.
    • Report Cards also got the attention of the Standing Committee on Access to Information, Privacy and Ethics. This had the effect of getting senior-level commitment to monitor and improve performance.
  • The benefits initially derived from the introduction of the Report Cards are less marked as the percentage of delay complaints is on the rise again.
  • One reason may be that, by focusing on deemed refusals, the previous report cards did not examine the reasons behind the performance. In addition, the efforts made by institutions to improve other elements of their access to information programs were not acknowledged or disseminated.
  • This is a message we have heard many times from federal institutions, along with suggestions to expand the measures of performance to provide a more contextual analysis.
  • Our intention with this new process is to address issues that permeate the whole ATI regime and to disseminate recommendations and best practices in a way that can be implemented by most, if not all, federal institutions.
  • It is also important for the results of the review to provide sufficient information about the challenges, weaknesses and strengths of the federal institutions under review, and to assess what progress has been achieved - in other words, to offer “contextual” information that will help in understanding the underlying reasons for the level of performance, good or bad.
  • We have identified five areas of improvement for the next cycle of reviews.
    • Review period
      • The Report Cards will be based on information collected on a fiscal year basis and will be linked to the government planning cycle. This will allow those institutions subject to the performance management framework, which institutes a formal process for holding heads of institutions accountable for how they manage their institution, to be assessed within that framework.
    • Selection of institutions
      • At the present time, the choice of candidates is not based on an articulated list of criteria. The candidates have in the past been institutions with a track record of underperforming with regard to delays.
      • Since we are in a transition year, in choosing which institutions will be reviewed, we will be looking at
        • Results from last year;
        • Trends uncovered by our complaints;
        • Other issues of interest to the OIC.
      • We will also choose at least one institution with a good track record to identify good practices.
      • We will be doing 10 Report Cards this year.
    • The assessment
      • At present, Report Card grades depend on the percentage of access requests that are not answered on time. We will be looking at a wider set of benchmarks.
      • In order to close the loop with last year’s process, we will continue to measure performance against deemed refusals. However, we will also go beyond that benchmark by looking at delays caused by other, more systemic issues, such as
        • The rising number of consultations with other institutions;
        • The addition of extra layers of approval and their impact on delays.
      • We will also look at other areas of interest such as proactive, voluntary or informal disclosure of information.
    • Reporting
      • In keeping with the proposed cycle, and with our consultative approach, the Commissioner will not include the results of the report cards in the annual report. Instead, we will table a special report under s. 39 of the ATIA containing the Report Card analysis and recommendations.
    • Process
      • Selected institutions will be informed in March that they have been chosen for an assessment.
      • We will meet with the institutions to discuss the criteria that will be used for the assessment and the information that will be required to conduct the review.
      • From April to May, we will be gathering information from the selected institutions. The information will cover fiscal year 07-08.
      • Over the summer months, a preliminary analysis will be provided to the institutions for discussion. This step allows us to discuss our findings, gives institutions an opportunity to clarify outstanding points, and lets us adjust the assessment as necessary.
      • There will also be time for every institution under review to prepare a response to the assessment.
      • A final report will be tabled in Parliament at the end of October.