The federal government’s VET FEE-HELP vocational education loan scheme was “not effectively designed or administered”, the Australian National Audit Office has found, in a sharply critical report that confirms many criticisms of a program widely seen as an expensive failure.
The scheme, designed to remove barriers to accessing vocational education and increase competition among private providers, led to a massive cost blowout. Rapidly rising course fees saw the value of loans dramatically overshoot forecast levels, growing more than 100-fold in the space of a few years: total loans under the scheme jumped from $25.6 million in 2009 to a peak of $2.9 billion in 2015.
It’s estimated that perhaps $2.2 billion of that will never be repaid, mostly because many students will never reach the income threshold to begin repayments.
A significant number never completed their course — though still incurred debt — and a lack of quality assurance means many who did finish have little to show for the debt they’ve racked up.
The scheme was established in June 2008, primarily to increase participation in vocational education and training. It was subsequently expanded in 2012 by allowing all eligible students to access a loan, and abolishing the requirement for a pathway to higher education.
Poor design and a lack of monitoring and control led to costs blowing out “even though participation forecasts were not achieved”, says the report, released Tuesday. “Insufficient protection” was given to vulnerable students from “unscrupulous private training organisations” which used predatory techniques to lure disadvantaged people into their courses, such as providing free laptops to those who signed up.
The VET FEE-HELP experience demonstrates that consumers cannot always be relied upon to keep the market competitive after price deregulation. Course tuition fees increased from an average of $4060 to $13,911 between 2009 and 2015, with the cost of the same course sometimes varying significantly between providers. For example, as at April 2016, the fees for a diploma in graphic design were $5492 with one provider and $59,860 with another; a diploma of aviation ranged from $32,330 to $96,000.
Department managed risks poorly
The auditor is critical of the Department of Education and Training, which “did not establish processes to ensure that all objectives, risks and consequences were managed in implementing the expanded scheme”. It was over-focused on growing the sector, leaving important issues such as educational quality and value to languish.
The department also failed to create an appropriate quality and accountability framework addressing identified risks, the ANAO found.
“In effect, the department’s focus on increasing participation overrode integrity and accountability considerations that would have been expected given the inherent risks. The department inadequately considered the implications of the changed incentives facing providers and students in the expanded scheme and its role in ensuring effective regulation in conjunction with other regulators — principally the Australian Skills Quality Authority and the Australian Competition and Consumer Commission,” says the report.
There was a lack of data analytics capability at the department and little internal management reporting or analysis of the scheme to identify emerging problems. The department did not develop measures to assess broader objectives of the scheme beyond growth, including those related to value and quality in the VET sector.
There was also a failure to learn from a similar experience in Victoria, ANAO noted. The recommendation contained in a regulation impact statement for a staged approach over three years did not occur, and the expanded scheme did not incorporate adequate controls over the risks identified in the statement.
Until 2015, the department had no compliance framework in place to deal with unscrupulous providers, and only “very limited and reactive” compliance activity.
There were “weaknesses” in the department’s administration of provider approvals, payment controls and managing student complaints.
ANAO argues there was also poor information flow. Payments to providers were based on data reported by the providers themselves, making it difficult for government to independently verify information about student loans. Information for students on their rights and responsibilities, as well as information on the cost, quality and reputation of VFH providers, was not easily accessible.
The government even had a “limited” level of assurance that students were aware they were entering into a loan, the auditor found.
Learning from failure
In recognition of its many problems, the scheme is being replaced on January 1 with the VET Student Loans program.
ANAO focused on what can be learned for future initiatives. According to the auditor, the key lessons from the VET FEE-HELP debacle are:
- thoughtfully considering the critical differences between a new program and any existing program on which it is modelled, including how different incentive structures for key participants (including financial incentives) will create risks to the achievement of program objectives. Similarly, in revising an ongoing program, recognising how substantially altered incentive structures will change behaviours and risks;
- learning from comparable experiences in other agencies or jurisdictions, and carefully considering supporting program documents, such as regulation impact statements, when designing and implementing programs;
- integrating risk management principles and processes into the design, implementation and administration of a program, to effectively manage risks to the achievement of the objectives and outcomes of programs;
- placing emphasis on achieving all program objectives and outcomes, rather than excessively focussing on the prime objective (such as participation in a program). Integrity, quality and sustainability are often intrinsically linked to the primary objective and need to be achieved;
- developing key performance indicators to measure the success of the program against all key objectives and outcomes. This will help focus attention on achieving all objectives and prevent entities from overlooking key risks. Evaluating programs with a focus on understanding their impact will indicate whether the underlying policy approach is an effective intervention;
- establishing a strong data analytics capability and management reporting processes to identify emerging threats and promote understanding and visibility of the outcomes of the scheme. In demand-driven programs, modelling and sensitivity analysis should be undertaken to forecast demand, and monitoring both uptake and cost can provide early warnings of potential threats to the effective and efficient implementation of programs;
- clarifying roles and responsibilities and introducing effective mechanisms for information sharing and engagement with all entities with a role in design or implementation. Where other regulators have a role, the key implementation agency should consult with those regulators to analyse the strength of the regulatory environment and address any notable shortcomings, including by drawing these to the attention of the government as early as possible; and
- ensuring fraud, risk and compliance arrangements are operational from the commencement of a program, and reflect program risks and requirements.
Changes were made from 2015 as it became clear to the government how far off course the program had gone. The department responded that it “has acted to address and strengthen a number of administrative processes and practices in these areas and will continue to do so through the new VET Student Loans program.”
But for the many students left unable to repay large loans for a low-quality course, the ship had already sailed.
Ultimately, the problems were too numerous, and the VET FEE-HELP system was scrapped. Education Minister Simon Birmingham’s new program, starting next year, is designed “to address significant issues with the operation of the previous scheme,” says the department. This includes:
- a clearer articulation that the program is designed to link training with employment outcomes;
- a new provider application process with a higher bar for entry based on track record;
- banning of brokers and curtailing the use of third party training providers;
- loan caps on eligible courses to put downward pressure on fees and protect students from rising debts;
- ensuring that payments to providers will be in arrears based on actual student numbers, requiring students to demonstrate genuine engagement in their training to continue to access their loan;
- introducing stronger powers to allow the department to rapidly address matters of compliance or poor performance.