European Commission

THE SIXTH FRAMEWORK PROGRAMME

The Sixth Framework Programme covers Community activities in the field of research, technological development and demonstration (RTD) for the period 2002 to 2006.

Structuring the European Research Area

FP6

Human Resources and Mobility

GUIDANCE NOTES FOR EVALUATORS

Participating in the evaluation of proposals for Human Resources and Mobility actions

March 2003


Contents

0. Introduction and General Issues
   0.1. Evaluation documentation
   0.2. The roles and responsibilities of evaluation participants
        0.2.1. Independent experts acting as evaluators
        0.2.2. Independent experts acting as rapporteurs
        0.2.3. Independent experts acting as chair/vice-chairpersons
        0.2.4. Independent experts acting as independent observers
        0.2.5. Commission officials
   0.3. Before the evaluation
   0.4. Overview of the evaluation process

The Evaluation Process
1. STEP 1 - Briefing of the evaluators

2. STEP 2 - Individual evaluation of proposals
   2.1. Conflicts of interest and confidentiality
   2.2. Evaluation criteria and forms
   2.3. Horizontal issues to be addressed
   2.4. Eligibility criteria addressed during the evaluation
   2.5. Proposal marking
   2.6. Thresholds and weightings
   2.7. Remote evaluation

3. STEP 3 - Consensus
   3.1. The consensus process
   3.2. Recommendations of 'indicators' by evaluators
   3.3. Recommendations for negotiation

4. STEP 4 - Panel meetings
   4.1. Conclusion of evaluation for proposals failing thresholds
   4.2. Conclusion of evaluation for proposals achieving the thresholds
   4.3. Evaluation Summary Reports (ESR)
   4.4. The panel report

5. Guidelines for the evaluation of specific evaluation criteria
   5.1. Added Value to the Community
   5.2. The evaluation of Gender Issues
   5.3. The evaluation of Ethical Issues
   5.4. Evaluation of Laboratory Based Research
   5.5. Web Site References
   5.6. Monitoring Statistics

Annexes
   Annex I: Individual Assessment Report (IAR)
   Annex II: Consensus Report
   Annex III: Ethical issues form EIR
   Annex IV: Outline panel report
   Annex V: Specific evaluation issues relating to each HRM action


0. Introduction and General Issues

0.1. Evaluation documentation

The evaluation of proposals under the Human Resources and Mobility¹ (HRM) actions is based entirely on the "Guidelines on proposal evaluation and selection procedures", which describe the general principles and procedures used in the evaluation of proposals. These guidance notes do not supersede the rules and conditions laid out, in particular, in the Council and Parliament Decisions relevant to the Sixth Framework Programme, the Call for proposals or the Guidelines on proposal evaluation and project selection procedures.

These guidance notes are prepared on the basis of the documents mentioned above. They describe the evaluation process in practical detail and contain, for one HRM activity, an example set of the evaluation forms needed for the evaluation process. They also contain an indicative format for the panel report which each group of evaluators will prepare at the conclusion of its work.

For the evaluation, you will also need to consult the relevant part of the "Structuring the European Research Area" Work Programme, which details the exclusion and evaluation criteria applied to proposals. It may also be necessary to refer to the HRM Guides for Proposers, which describe in detail the contents required in proposals for the first HRM calls and tell proposers how their proposals should be prepared and submitted. These documents may be found on the specific call page for HRM actions, available at: http://fp6.cordis.lu/fp6/calls.cfm

Evaluation criteria and evaluation forms, and in some cases the evaluation procedures, differ according to the HRM action being evaluated. For each proposal which is evaluated, ONLY THE CRITERIA, FORMS AND PROCEDURES APPROPRIATE TO THE SPECIFIC HRM ACTION MUST BE USED.

¹ Please note that the guidance in this document relates to HRM actions except some specific support actions (e.g. the European Network of Mobility Centres and presidency conferences).


0.2. The roles and responsibilities of evaluation participants

The evaluation and selection of proposals is carried out by the Commission with the assistance of independent experts, who carry out evaluations and may also be invited by the Commission to perform the roles of rapporteur for consensus discussions and panel chairperson.

0.2.1. Independent experts acting as Evaluators

Evaluators perform evaluations on a personal basis, not as representatives of their employer, their country or any other entity. They are expected to be independent, impartial and objective, and to behave throughout the evaluation process in a professional manner. They conform to the "Code of Conduct for independent experts appointed as evaluators", which is appended to the "Guidelines on proposal evaluation and project selection procedures", and must sign a confidentiality and conflict of interest declaration before beginning their work. These obligations must be adhered to at all times: before, during and after the evaluation.

0.2.2. Independent experts acting as Rapporteurs

A rapporteur may be appointed for each proposal in order to prepare a consensus report summarising the opinions of the evaluators associated with each proposal allocated to them, and to act, where necessary, as a moderator in reaching such a consensus. The rapporteur will normally be one of the experts who has evaluated the proposal concerned. Rapporteurs are assigned to a proposal on the basis of their knowledge of the broad scientific field, without necessarily being specialists in the specific topic of the proposal (generalists in the field). In such a case, there will be at least two other evaluators who are specialists in the field of the proposal.

0.2.3. Independent experts acting as Chair/Vice-chairpersons

In order to assist the Commission services in the logistical management of evaluation panels, one or more chair/vice-chairpersons may be appointed from the list of evaluators. Their task will typically involve monitoring the overall progress of the evaluation and managing the panel meetings.

0.2.4. Independent experts acting as Independent Observers

Independent experts may be appointed as observers to examine the evaluation process from the point of view of its working and execution. The role of the observers is to give independent advice to the Commission on the conduct, fairness and equity of the evaluation sessions, on ways in which the procedures could be improved, on the evaluation criteria used in the sessions and on the way in which the evaluators apply these criteria. They do not express views on the proposals under examination or on the evaluators' opinions of the proposals. They conform to the "Code of Conduct for independent observers of the evaluation process", which is appended to the "Guidelines on proposal evaluation and project selection procedures".


0.2.5. Commission officials

Commission staff will organise a confidential, fair and equitable evaluation of each proposal according to the criteria applicable to the specific call, in full respect of the relevant procedures, rules and regulations. They will ensure that the process runs smoothly and fairly, that access to information pertaining to proposals is strictly controlled, and that the most efficient use is made of the time of all concerned. In consensus and panel meetings, Commission staff may act as moderators, seeking consensus between the independent experts without any prejudice for or against particular proposals or the organisations involved. The work of an evaluator will be monitored throughout the evaluation by the Commission officials organising it.

In organising the evaluation, the Commission is assisted by contracted support staff from the Evaluation Service Provider (ESP). They play no formal part in the evaluation process, but provide logistical support.

Commission staff will not attempt to influence the opinion of the independent experts. Even if asked, they may not express any opinion on the merits or otherwise of any proposal. They may, however, provide additional information or assistance on request.

0.3. Before the evaluation

On receipt by the Commission, proposals are opened, registered and acknowledged, and their contents are entered into a database to support the evaluation process. Basic exclusion criteria for each proposal are also checked by Commission staff before the evaluation begins, and proposals which do not fulfil these criteria are excluded.

Depending upon the number of proposals received, the evaluation may be carried out by a single group of evaluators or in different groups or sub-groups, split according to subject and/or HRM scheme. Evaluators will be informed about the precise breakdown of any groups during the briefing session.

In organising the evaluation, Commission staff assign the proposals to a panel, based on the panel requested by the applicant where appropriate. The assignment of evaluators to a panel and the allocation of proposals to evaluators will be carried out taking account of the fields of expertise of the experts and any relevant conflicts of interest. If the subject matter of a particular proposal covers more than one panel, appropriate means to evaluate it fairly will be established. This may involve, for example, inviting evaluators from other panels to participate in the evaluation of the proposal, or forming an ad-hoc cross-cutting group of evaluators.

In evaluating proposals for any of the HRM actions, proposals may be supplied to evaluators who will evaluate them at their place of normal activity (i.e. remotely), on the condition that they have previously signed and returned the confidentiality and conflict of interest declaration.


0.4. Overview of the evaluation process

The evaluation of proposals for all HRM actions is carried out using the single-stage procedure (that is, the full proposal is submitted in a single stage), as described below. Each evaluation session consists of a number of steps, as described in the "Guidelines on proposal evaluation and project selection procedures" (with an accompanying flow chart).

• Step 1: Briefing of the evaluators. All evaluators are briefed orally or in writing before the evaluation by representatives of the Commission's services, in order to inform them of the general evaluation guidelines and the objectives of the relevant HRM action.

• Step 2: Individual evaluation of proposals. Each proposal is evaluated against the applicable criteria independently by a minimum of three evaluators, who fill in individual evaluation forms giving marks and providing comments.

• Step 3: Consensus. For each proposal a consensus should be reached and a consensus report will be prepared. This report will faithfully reflect the common views of the evaluators referred to in Step 2.

• Step 4: Panel meeting. A panel discussion may be convened, if necessary, to examine and compare the consensus reports and marks in a given area, to review the proposals with respect to each other and to make recommendations on a priority order of proposals.


The Evaluation Process

1. STEP 1 - Briefing of the evaluators

Evaluators will be provided with a briefing (or briefings), orally, in writing or using electronic media, by Commission staff before the evaluation begins, covering the evaluation procedure and the technical issues involved in the particular HRM action. The key personnel involved in the evaluation will be identified and their roles explained by the Commission staff responsible for the activity. In the case of remote evaluations, the means of communication between the Commission staff and the evaluators will be specified.

2. STEP 2 - Individual evaluation of proposals

Each proposal will first be assessed by a minimum of three evaluators chosen from among the pool of experts taking part in the evaluation. The key aspects of the assessment are described below.

2.1. Conflicts of interest and confidentiality

An expert involved in an evaluation must not have a direct or indirect conflict of interest with any of the proposals that they evaluate.

An evaluator is deemed to have a direct conflict of interest when any of the following applies:
• they are employed by the same institution as the applicant and work in collaboration with the applicant at department level;
• they work closely in collaboration with the applicant;
• they were involved in the preparation of the proposal; or
• they are in some other way closely related to the applicant (family relationship) or to the work of the applicant (professional relationship) in a way that compromises the evaluator's ability to evaluate the proposal impartially.

In such a case the evaluator should not take part in the evaluation of the proposal and should not attend a panel meeting where such proposals are being evaluated.

An evaluator is deemed to have an indirect conflict of interest when none of the cases in the preceding paragraph applies and any of the following applies:
• the evaluator is employed by the same institution as the applicant;
• the evaluator would directly benefit from the proposal being funded or not funded in the context of their own research activities;
• the evaluator is involved in a contract or research collaboration with the applicant; or
• there is any other relationship with the proposal such that the evaluator may not be able to evaluate the proposal impartially.

In particular, an expert working at an institute of an extended organisation, such as an international or national research organisation, is deemed to have an indirect link with any proposal submitted by the same organisation.


In such cases the evaluator may take part in an evaluation round involving such proposals, but may not directly evaluate such a proposal and may not take part in discussions relating to it.

If during the evaluation itself an evaluator discovers that they are in some way connected with a proposal which they have been asked to evaluate, or have some other involvement which impairs their impartiality, they must declare this immediately to the responsible Commission official(s), who will then take all necessary actions to remove the conflict of interest.

Confidentiality

Please note that the evaluation of proposals is a confidential process and experts will be required to sign a conflict of interest and confidentiality clause before receiving proposals. In particular, when carrying out evaluations remotely, evaluators are expected to keep proposal information confidential and not to release proposal or evaluation information to third parties.

Nothing may be photocopied by an evaluator without specific permission from a Commission official. No documents or electronic data in any form may be taken off the evaluation premises. Phone calls to or from evaluators during the working day are not allowed in the reading and meeting rooms. Laptops may not be used in the evaluation space. Under no circumstances may an evaluator attempt to contact a proposer on their own account, either during the evaluation session or afterwards.

2.2. Evaluation criteria and forms

The proposal will be evaluated in terms of pre-determined blocks of evaluation criteria², as described in the Work Programme. The Work Programme also gives any threshold marks and the weights which will be applied to each of the criteria. The blocks of evaluation criteria list a number of detailed issues (sub-criteria or prompting questions) which the evaluator should consider during the assessment of that block. When examining proposals, evaluators may only apply the evaluation criteria which are set out in the Work Programme and shown on the evaluation forms.

At this stage the evaluators act individually and independently; they do not discuss the proposal with each other, nor with any third party. The evaluators record their individual opinions on special forms, the Individual Assessment Report (IAR) forms, giving scores and comments on the evaluation criteria and addressing the horizontal issues described in the Work Programme (see below). These forms detail the criteria to be used.

² "Block of criteria" refers to the main numbered headings in the Work Programme annex under which several evaluation issues are grouped.


After completing an individual evaluation and assigning a score for each block of criteria, the overall score is calculated by summing the block scores after applying the appropriate weightings.

As the evaluation criteria differ between the HRM actions, there are different versions of Form IAR for each action. Evaluators should ensure they are using the correct version of Form IAR for the proposal they are evaluating.

Each evaluator signs their own form. Signature of the IAR form closes this step of the evaluation. In signing the IAR, the evaluator also declares that they have no conflict of interest in evaluating the proposal. In the case of remote evaluation, the evaluator's login and password substitute for the signature.
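To illustrate the weighted sum described above, the following minimal Python sketch computes an overall score from block scores. It is an illustration only, not part of the official procedure: the criterion names and scores are hypothetical, while the example weights follow the Marie Curie Intra-European Fellowship form reproduced in Annex I; the actual weights for each HRM action are those given in the Work Programme.

# Illustrative sketch only: overall score as a weighted sum of block scores.
# Weights follow the example EIF form in Annex I; the scores are invented.

block_weights = {
    "scientific_quality": 0.15,
    "research_training": 0.15,
    "quality_of_host": 0.15,
    "quality_of_researcher": 0.15,
    "management_feasibility": 0.05,
    "relevance_to_activity": 0.25,
    "added_value_to_community": 0.10,
}

block_scores = {
    "scientific_quality": 4.2,
    "research_training": 3.8,
    "quality_of_host": 4.5,
    "quality_of_researcher": 4.0,
    "management_feasibility": 3.5,
    "relevance_to_activity": 4.1,
    "added_value_to_community": 3.0,
}

# Overall score out of 5: each block score multiplied by its weight, then summed.
overall_out_of_5 = sum(block_scores[b] * w for b, w in block_weights.items())

# The same score expressed out of 100, as on the evaluation summary page.
overall_out_of_100 = overall_out_of_5 / 5 * 100

print(f"Overall score: {overall_out_of_5:.2f} out of 5 ({overall_out_of_100:.1f}%)")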

2.3. Horizontal issues to be addressed

In addition to the blocks of evaluation criteria, a number of horizontal issues must also be evaluated and, if necessary, recommendations made.

• Gender: specific guidance for the evaluation of gender issues for HRM actions is given in section 5.2. In cases where gender issues are relevant and have not been adequately addressed by the applicant, this should be reflected in the scores given for the proposal, where relevant, or in the comments given by the evaluators (in particular in any recommendations to be taken into account at the negotiation stage).

• Ethical: specific guidance for the evaluation of ethical issues is given in section 5.3 and in the relevant annex of the Guidelines on proposal evaluation and project selection procedures. If during their reading of a proposal evaluators note that there are ethical issues to be addressed, they must flag this using the box provided on Form IAR. The issue will then be discussed further at the consensus step and, if necessary, Form EIR will be completed (see below)³.

2.4. Eligibility criteria addressed during the evaluation

Eligibility criteria defined in the Work Programme may be assessed during the evaluation process (e.g. the assessment of whether a researcher has the equivalent of four years of full-time research experience, for certain schemes). If, after initial evaluation, it is felt that a proposal fails one of the eligibility criteria, the evaluator should draw this to the attention of the Commission representative and the rapporteur. If the Commission representative agrees with the view of the evaluators that the proposal does not fulfil one of the eligibility criteria, the evaluators will assign a score of zero against each of the evaluation criteria and the proposal will not be evaluated further.

In cases where there is doubt over the status of the proposal with regard to the eligibility criteria, the experts will proceed with the evaluation. The general comments will explain the specific issue, which will then be further examined by the Commission services, who will decide on the eligibility status accordingly.

³ Note that, according to the Guides for Proposers, proposals should include a declaration on ethical issues where relevant.


2.4.1. Potential overlap with the EURATOM programme

Where a proposal falls within a research or training area or type of action that overlaps significantly with one specified in a call published under the EURATOM⁴ treaty, it may not be possible to fund the proposal under the Sixth RTD Framework Programme. Proposals which are known to address a research or training area which overlaps significantly with such calls should be drawn to the attention of the Commission representative. The proposal should continue to be evaluated as though it could be funded under the Sixth RTD Framework Programme, and any potential eligibility issues will be taken into account by the Commission services.

2.5. Proposal marking

Evaluators examine the individual issues comprising each block of evaluation criteria and mark each block on a scale from 0 to 5. The integer scores indicate the following with respect to the block under examination:

0 - the proposal fails to address the issue under examination or cannot be judged against the criterion due to missing or incomplete information
1 - poor
2 - fair
3 - good
4 - very good
5 - excellent

The sub-criteria or "prompts" comprising the blocks of criteria are not scored; the evaluator only records observations on them on the form. They are there to help the evaluator support their eventual judgement on what score to assign to the criterion concerned when they have finished their reading, and to remind them of issues they may wish to raise later during the discussions of the proposal.

When assigning scores in the range 0 to 5, an evaluator should be aware of any threshold score which may apply to each block of evaluation criteria. The integer part of the score should reflect the overall impression held by the evaluator of the proposal for the criterion being considered. The score may be given to a resolution of one half mark, or to one decimal place (depending on the scheme), reflecting variations within a given integer score.

Evaluators are required to provide a comment for each criterion being assessed. Evaluators are encouraged to give their comments in a way that clearly reflects their overall opinion and the specific strengths and weaknesses of the proposal for each criterion. This will assist in the production of consensus reports later in the evaluation process. Comments should also be in a form suitable for providing feedback to proposers after the evaluation, and must be consistent with any scores awarded.

⁴ Sixth Framework Programme for EURATOM (the European Atomic Energy Community) (2002-2006) (OJ 355 of 30 December 2002).


Guidance on marks to be awarded for a criterion:

Excellent - In general an evaluator should not use the score of 5 unless they feel that the content of the criterion being evaluated would be recognised as excellent by all evaluators within the panel and by any evaluator asked to express an opinion. The evaluator should also feel that the content of the proposal could not be improved. In cases where a score approaching 5 is awarded, the evaluator should feel confident that there will be a high level of consensus among all evaluators evaluating this criterion.

Very good - Scores in the range 4.0 to 4.9 should reflect that the proposal has identifiable features demonstrating that it is of high quality with regard to the evaluation criterion being assessed. There should be features that set the proposal apart from other good quality proposals within the evaluation.

Good - Scores in the range 3.0 to 3.9 should reflect that the proposal demonstrates overall good features with regard to the evaluation criterion (even though it may contain some notable weaknesses), or does not contain features that set it apart from many other good proposals being evaluated.

Fair - A score of 2.0 to 2.9 should be awarded where the content for the criterion being evaluated is at a level consistent with that routinely produced by research establishments across Europe. There may be some strong and relevant points within the proposal, but there may also be weaknesses and, in particular, no specific details that single the proposal out from others. Evaluation comments for proposals awarded marks in this range should indicate the areas where the proposal could be improved if subsequently re-submitted.

Poor - A score of 1.0 to 1.9 should be awarded if the proposal is of poor quality for the criterion being evaluated. This may be because information is incomplete in the view of the evaluator, not clear or not convincing. Evaluation comments for proposals in this category should indicate the areas where the proposal is lacking or of poor quality and could be improved if subsequently re-submitted.

A score of zero should be given for a criterion if information that the evaluator would reasonably have expected, as detailed in the Guide for Proposers, is not present in the proposal. The specific information missing should be entered in the comments section. It is not anticipated that scores will be given in the range 0.1 to 0.9.

2.6. Thresholds and weightings

Threshold scores are applied to certain criteria, as described in the Work Programme, as well as to the overall score. Proposals for which the consensus score, as verified by the panel (if convened), fails to achieve one or more of the threshold scores will not be considered for Community funding. In general, the most critical evaluation criteria have a high threshold and a low weighting (i.e. a high quality assessment for such a criterion is considered essential). Weightings alone, therefore, do not necessarily reflect the importance of a criterion.
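To show how block and overall thresholds interact with the scores, here is a minimal, hedged Python sketch. The threshold values mirror the example EIF form in Annex I (block thresholds of 3 and 4, overall threshold of 3.5); the scores are invented, and the real thresholds for any given action are those stated in the Work Programme.

# Illustrative sketch: checking block thresholds and the overall threshold.
# Threshold values follow the example EIF form in Annex I; scores are hypothetical.

block_thresholds = {
    "research_training": 3.0,
    "quality_of_researcher": 4.0,
    "management_feasibility": 3.0,
}
overall_threshold = 3.5  # overall score out of 5

def passes_thresholds(block_scores, overall_score):
    # A proposal failing any block threshold, or the overall threshold,
    # cannot be considered for Community funding.
    for block, threshold in block_thresholds.items():
        if block_scores.get(block, 0.0) < threshold:
            return False
    return overall_score >= overall_threshold

# Example: a high overall score does not compensate for a failed block threshold.
example_scores = {
    "research_training": 4.5,
    "quality_of_researcher": 3.8,  # below its threshold of 4.0
    "management_feasibility": 4.0,
}
print(passes_thresholds(example_scores, overall_score=4.2))  # -> False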


2.7. Remote evaluation

The Commission may decide to arrange for the individual evaluation and consensus steps to be carried out away from Commission-controlled premises (i.e. remotely). When remote evaluation is used for the individual reading and evaluation of proposals, copies of the proposals assigned to each evaluator are forwarded by post (paper copies) or made available electronically via a web-based application provided by the Evaluation Service Provider. In the latter case, each evaluator receives a user identification and password providing on-line access to the system. The evaluator has unique on-line access to the individual evaluation form and can download the proposals assigned.

The evaluator is encouraged to save (and not submit) scores for each proposal until he/she has assessed all proposals assigned, so that the marks awarded are consistent across all proposals. Once evaluators have evaluated all of their proposals they should finalise their scores. Finalisation or submission of the evaluation form closes the individual evaluation stage for that particular proposal.

The evaluator will be informed about the deadlines for evaluating the proposals assigned. Please note that, for remote evaluation, deadlines for finalising initial independent evaluations and, if relevant, consensus building must be strictly followed. Evaluators who do not complete their evaluation tasks within the allocated time period will be deemed not to have evaluated the proposal, and such proposals will be re-allocated to other experts. An evaluator who fails to finalise their evaluation on time will not be entitled to claim a fee for that evaluation.

Any conflict of interest discovered while reading a proposal must be notified to the Commission, who will reassign the proposal to another evaluator. The Commission will maintain close contact with remote evaluators to assist them with any queries.

Practical guidelines for completing the evaluation process

• Assess and mark the proposal exactly as it is described and presented. Do not make any assumptions or interpretations about the project in addition to what the proposers themselves have written in their proposal.
• Keep to the evaluation criteria as stated in the forms.
• Give scores and write comments for each block of criteria.
• Maintain consistency in your scoring throughout your work.
• Provide a brief but explicit justification for each of your scores. Be honest but correct, in particular when scores are low: use polite and correct language, but do not hide the facts, as your remarks may be used in the report which is sent to inform the proposers of your conclusions. It is often useful to quote short extracts from the proposal text.
• Where justified, give recommendations for modifications to the proposal, but ensure that the scores given reflect the proposal as presented by the applicant.
• If you are using paper forms, complete your forms clearly so that they are legible by Commission staff, or use the IT application provided.
• At the start of the evaluation, it is recommended that evaluators examine a number of proposals before "signing off" their first individual assessment forms. This will help to calibrate their scoring.


3. STEP 3 - Consensus

3.1. The consensus process

When all the evaluators of a particular proposal have completed their individual report forms, they will be given access to the monitoring statistics (described later in this document) so that they can compare their marking profile with that of other evaluators. They will also be given access to the evaluation comments and scores of the other evaluators assigned to that particular proposal.

An evaluator may be authorised to revise their initial marks in the light of the monitoring statistics and of the scores awarded by other evaluators for a given proposal, if they have changed their opinion on their individual evaluation or if their own marking profile is not consistent with that of other evaluators. Any scores modified at this stage will not erase the initial individual scores, but will be recorded as revised individual scores. The revised scores will be used as the evaluator's scores in the consensus stage of the evaluation.

A rapporteur will be nominated to prepare the consensus form for a given proposal and to obtain approval from the other evaluators. This may happen without the need for a formal meeting, particularly in the case of remote evaluation. The rapporteur will be responsible for recording the outcome of the discussion/exchange using the appropriate Consensus Report (CR) form, and will attempt to prepare a consensus report (one for each proposal), including both comments and scores, which is acceptable to all evaluators concerned. The revision of the consensus form will continue until a common opinion on the proposal is reached, i.e. a conclusion with which all individual evaluators agree regarding the marks for each criterion and the accompanying comments. In the event of persistent disagreement the rapporteur may:

• accept to record the majority view;
• bring in up to three additional evaluators to examine the proposal; or
• convene a face-to-face meeting between the evaluators and the rapporteur for the proposal.

In any circumstance where the rapporteur, in consultation with the Commission official, feels that the quality of the evaluation would be improved, additional evaluators may be selected to examine the proposal.

At the end of this process, the Consensus Report will be proposed by the rapporteur and agreed by the evaluators of each proposal. If consensus has not been reached, the report sets out the majority view of the evaluators, but also records any dissenting views. If such persistent disagreement cannot be resolved, the threshold score or the average score given by the evaluators (whichever is higher) will be awarded for the proposal (either for the relevant block or for the total score, as appropriate). The consensus report will then be passed to the panel for discussion and revision if necessary.


From the consensus scores for each criterion given on the CR, an overall score for the proposal will be calculated by the Commission services, using the weighting scheme defined in the Work Programme.

In the case of remote evaluations, after the individual evaluations are finalised, the rapporteur compiles a draft consensus report electronically. The rapporteur enters his/her comments, based on a synthesis of the individual comments of the evaluators and a composite view of their scores, and then submits the draft report electronically to the evaluators concerned. Each expert confirms, with the electronic form, whether they agree or disagree with the consensus scores and comments for each block of criteria. When the evaluators agree on the consensus report according to the procedures set out in the Guidelines, the rapporteur submits the final version. The evaluators' logins and passwords substitute for their signatures.

3.2. Recommendation of 'indicators' by evaluators

In addition to overall comments on the proposal, the evaluators may provide in the Consensus Report appropriate indicators which they feel would assist the Commission services in determining the successful implementation and execution of the proposal, if it is selected for funding. These indicators would be used throughout the life-cycle of the project as a means of judging the success or otherwise of the project.

3.3. Recommendations for negotiation

An evaluator may recommend to the Commission services that one or more details of the proposal be checked, monitored or amended should the proposal be selected for funding. Examples of potential modifications are the size or duration of an application, or the type of researchers being targeted by the application. Such potential modifications should be clearly indicated in the comments section for each relevant proposal as a "recommendation for negotiation". It should be noted, however, that recommended modifications can only be accepted in justified cases and if supported by all relevant evaluators. It should also be noted that proposals must be evaluated as submitted to the Commission services; evaluators should not assume that any recommendation for negotiation will be successfully concluded.

There is one CR form per proposal. It will be signed by all the evaluators of the proposal in the consensus meeting or, as a minimum, by the rapporteur (who signs to indicate that he/she has recorded the consensus view of the evaluators) and a Commission official (who signs to indicate that the evaluators recorded have been consulted). If the EIR form is used, it will be signed by all the evaluators of the proposal in the consensus group or, as a minimum, by the rapporteur and the Commission official as described above. Signature of the CR form and, if needed, the EIR form(s) closes this step of the evaluation.


4. STEP 4 - Panel meetings

After the consensus phase has taken place for all of the proposals submitted to a particular call/panel, the Commission may convene a panel meeting. The panel will comprise some or, if appropriate, all of the expert evaluators from the individual assessment step. Additional experts may also be invited to participate. The panel's discussions will be moderated by a Commission official and may additionally be chaired by an expert appointed by the Commission, assisted by a panel rapporteur and/or vice-chairperson. The panel, if convened, will discuss and re-examine the consensus reports for all proposals and may, in duly justified circumstances, revise the consensus scores of proposals.

4.1. Conclusion of evaluation for proposals failing thresholds

Proposals for which the consensus report (after approval or review by the panel) indicates that one or more of the evaluation thresholds has not been met will not continue in the evaluation.

4.2. Conclusion of evaluation for proposals achieving the thresholds

For proposals which the panel agrees have passed all the evaluation thresholds (by block and overall), the Commission services will prepare a provisional list in order of the overall score of each proposal. This list will initially be ordered using the following priority order:

I. Highest total score.
II. For proposals with an equal total score, proposals will be ordered by:
• the highest score for the criterion with the highest threshold; then
• the highest score for other criteria with a threshold, in descending order of threshold; then
• the highest score for the criterion with the largest weighting; then
• the highest score for other criteria, in descending order of weighting.

The provisional ordering of the list will follow the logic above unless otherwise specified in Annex V. In the case of equal thresholds or weightings, the order is specified in the same annex.

Based on the provisional ordering described above, the panel will first make an overall review of the scores and opinions given in the consensus reports on each above-threshold proposal. This serves both to bring the weight of the whole panel's experience and expertise to the review of each proposal, and to ensure that the same standard of scoring is applied to each. The panel may propose to revise the scores or comments given. Any agreed changes in scoring, or additional or revised comments relative to those originally given in the CR forms, will be reflected in the Evaluation Summary Reports (see below).
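The tie-breaking rules above can be read as building a composite sort key. The short Python sketch below shows one possible reading of that logic; the data layout, criterion names, thresholds and weights are hypothetical (loosely following the EIF example in Annex I), and any action-specific ordering rules given in Annex V would take precedence over this simplified version.

# Illustrative sketch of the provisional priority ordering described above.
# Criterion names, thresholds and weights are example values; Annex V rules are not modelled.

criteria = [
    # (name, threshold or None, weight)
    ("quality_of_researcher", 4.0, 0.15),
    ("research_training", 3.0, 0.15),
    ("management_feasibility", 3.0, 0.05),
    ("scientific_quality", None, 0.15),
    ("quality_of_host", None, 0.15),
    ("relevance_to_activity", None, 0.25),
    ("added_value", None, 0.10),
]

def priority_key(proposal):
    # Sort key: total score first; then the criteria with thresholds, highest
    # threshold first; then the remaining criteria in descending order of weighting.
    scores = proposal["scores"]
    with_threshold = sorted((c for c in criteria if c[1] is not None), key=lambda c: -c[1])
    without_threshold = sorted((c for c in criteria if c[1] is None), key=lambda c: -c[2])
    key = [proposal["total"]]
    key += [scores[name] for name, _, _ in with_threshold]
    key += [scores[name] for name, _, _ in without_threshold]
    return tuple(key)

# Two proposals with the same total score: the tie is broken by the score on the
# criterion with the highest threshold ("quality_of_researcher" in this example).
proposals = [
    {"id": "P1", "total": 4.2, "scores": {name: 4.0 for name, _, _ in criteria}},
    {"id": "P2", "total": 4.2, "scores": {name: (4.5 if name == "quality_of_researcher" else 4.0)
                                          for name, _, _ in criteria}},
]

ranked = sorted(proposals, key=priority_key, reverse=True)
print([p["id"] for p in ranked])  # -> ['P2', 'P1']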


Any revision of scores will generate a revised priority order. This list will be further reviewed by the panel, which will thus propose a priority list for the Commission services to consider when deciding which of the proposals to fund.

For these above-threshold proposals the panel will then prepare the individual Evaluation Summary Reports (ESRs), which the Commission services will send to each proposal co-ordinator, giving the outcome of the evaluators' assessment of the proposal. The ESR should not contain dissenting views; the panel should resolve those cases where full consensus was not achieved at the previous step.

4.3. Evaluation Summary Reports (ESR)

The Evaluation Summary Report is the document which is returned to the proposal co-ordinator to give an account of the outcome of the evaluators' assessment of the proposal. It represents the advice of the evaluators to the Commission, which the Commission will take into account in the final selection of projects for negotiation. An ESR is sent for all proposals evaluated. Co-ordinators of proposals which failed one or more eligibility criteria, and which were therefore not evaluated, receive a letter from the Commission informing them of the reasons for exclusion on eligibility grounds, rather than an ESR.

If a panel has been convened, the ESR is based on the scores and conclusions initially reached in the consensus report (Form CR), supported by any relevant information from the EIR, and then reviewed and discussed by the panel. In those cases where a consensus report in fact failed to reach a consensus and ended only with a majority view, the panel will come to a clear conclusion, without contradictory majority/minority views, which can be conveyed to the proposers in the ESR. If no panel is convened, the comments and marks of the ESR will be the same as those of the Consensus Report (CR).

Evaluators should ensure that the comments contain any recommendations that they wish to have taken into account during any possible contract negotiations. These recommendations should be as clear and specific as possible.

For proposals which failed to reach the threshold on one or more of the evaluation criteria, the ESR will contain scores and comments only for those criteria fully evaluated, to clarify for the proposers the reason or reasons for the proposal's failure, so that, if possible, they may submit an improved proposal in a later call. It will only contain an overall score and overall comment if the evaluation was not stopped due to a threshold failure.

4.4. The panel report

Each panel will conclude its work by preparing a panel report which summarises its activities and conclusions. An indicative panel report format is shown as an annex to this document; the precise format will be notified to the experts at the appropriate time. The report will be signed, as a minimum, by three persons, who may be the panel chairperson, the panel rapporteur (if used) or other panel members. The ESRs for all of the proposals considered by the panel will be appended to the report.


5. Guidelines for the evaluation of specific evaluation criteria

Section 2.2 refers to the evaluation of each block of criteria, as defined in the relevant annex of the Work Programme. The following sections provide further guidance to evaluators so that they can form a consistent view when assessing the relevant scores for a given evaluation criterion. They should be read in conjunction with the relevant annexes of the Guide for Proposers.

5.1. Added Value to the Community

The Sixth Framework Programme is characterised by the inclusion of policy-relevant criteria among the evaluation criteria. Evaluators are required to form a view, based on their experience of research in the European scientific community as a whole, on the relevance of the arguments put forward by the proposers in support of Community policies and, in particular, the European Research Area.

In this context, proposers are advised at the proposal preparation stage to highlight the particular Community policies that the proposal addresses, preferably with reference to Community documents or statements, and to demonstrate how the proposed application supports and addresses these policies. The proposer may address all relevant Community policies by pointing out how the specific content of the proposal addresses each one, or may focus on one or more particular policies, presenting detailed arguments on how the planned activities address the specific issues. Proposers may focus their contribution on policy objectives associated with the field of research described in the project (e.g. declared policy positions relating to the environment or social policy), but all proposers are advised to look globally at the proposal and indicate where policy objectives relevant to the structuring of the European Research Area are addressed.

Examples of issues addressing Added Value to the Community:

• Proposers involved in host-driven actions could focus, in addition to specific policy objectives, on the role of the proposal in structuring the European Research Area: its contribution to encouraging transnational mobility, to fostering the mutual recognition of qualifications, or to encouraging interdisciplinary or intersectoral projects/training (raising public awareness of the scientific discipline or enhancing the total number of researchers within the field or globally).

• A proposal involving both industry and academia may highlight the potential long-term benefits and synergies created during the term of the project, which will contribute to the declared position of increasing research expenditure in Europe towards 3% of GDP (Gross Domestic Product) by 2010.

• Proposers may highlight, where relevant, the impact that the proposal might have on gender issues, for example in focusing on the attractiveness of science for women, or in positively addressing a shortfall in the number of suitably qualified women in specific fields. In this regard a host fellowship would be expected to monitor participation rates and react as appropriate to the under-representation of either gender.




• The opening of European research to third country nationals is an additional characteristic of the Sixth Framework Programme, and any contribution that a proposal might make in fostering international relationships, expanding the pool of researchers in Europe and enhancing the research potential of both Europe and international collaborators should be described.

Note that, if gender-related issues are relevant in terms of Community policies or added value, they must be evaluated under the "Added Value to the Community" criterion. Section 5.2 below gives further information on how gender issues are taken into account in the evaluation.

5.2. The evaluation of Gender Issues

Where gender issues are a core element of the proposal

Where the scientific content of a proposal is related directly to gender issues, for example as part of the subject of a project submitted to the Social, Economic and Human Sciences panel, this is treated as any other scientific sub-discipline would be. That is to say, experts from this particular field would be invited to evaluate the proposal and assign scores against the technical content of the proposal. These scores would feature in the evaluation of the "Scientific Content of the Proposal/Project", the "Training/Transfer of Knowledge" and/or the "Quality/Capacity of the applicant". The gender dimension will not figure in the evaluation of these three evaluation criteria if the proposal is in an area unrelated to gender issues.

Where the participation rate of women is relevant to the proposal

For host-driven actions, where a host will be expected to select fellows over the course of a contract, it is a reasonable expectation that the host will monitor the participation rate of women. It is reasonable, therefore, to expect a proposal for such an activity to include a plan by which it would, if selected, actively monitor this rate against a specified objective, and a conceptual plan of how it would react in the event of under-performance (e.g. targeted promotions). In some cases the proposer will be expected to specifically address the issue of an equal opportunity policy for the recruitment of researchers as one of the specific questions under the evaluation criterion "Management and Feasibility". Note that this criterion will not be relevant for individual-driven actions, such as individual fellowships or excellence awards.

Where gender considerations are relevant to Community policies or to the scheme

In the policy-relevant criterion "Added Value to the Community", the specific issue of gender may be highlighted by a proposer. Given that there will be a number of policy issues to be evaluated, it is for the proposer to highlight specific Community policies and argue how their proposal addresses one or more of them. For example, the technical content of a proposal may address a training area or project in a particular scientific discipline (other than gender issues), while the policy element of the proposal may focus on the under-representation of women in this sector. Action may be described which would focus on redressing the balance, with a goal of overall gender equality for the sector. Evaluators would score this approach based on the quality of the argument presented.


Alternatively, a proposal may address several policy issues, presenting a case for supporting each. In this way evaluators would weigh the strength of the argument against each of the policy objectives described in the proposal. In either case the proposer will be expected to specifically address the impact of the proposal with respect to its potential for improving the gender balance in the scientific/training area.

5.3. The evaluation of Ethical Issues

During the individual evaluation process the evaluator will be asked to identify ethical issues related to the proposal. If ethical issues are identified, the evaluator will complete an Ethical Issues Report (EIR) for the proposal. This form will assist in identifying cases where the proposal should be passed for subsequent evaluation by an Ethical Review Panel.

If ethical issues are identified but, at the consensus stage, the evaluators agree that they are not among those which must be forwarded for further ethical review and that they have been adequately addressed by the proposer, the consensus report may conclude that the proposal need not be forwarded to the Ethical Review Panel. In such cases the EIR form, prepared by the rapporteur and agreed by the evaluators, will serve as a record that ethical considerations have been taken into account in the evaluation. The Commission will decide whether to conduct an ethical review of the proposal by a specialist panel at a later date, as described in the "Guidelines on proposal evaluation and project selection procedures".

5.4. Evaluation of Laboratory Based Research

In some schemes the evaluator will be asked to determine whether a research project or training programme is laboratory based or not. The effect of this assessment is to determine the amount of financial support provided by the Commission (a greater amount is awarded in the case of a laboratory-based project or training programme).

For the purposes of the evaluation, a project or training programme will be judged to be laboratory based if there are costs associated with the proposal beyond those of a purely theoretical study executed within an office-based environment. Research or training activities such as field trips, expensive computer run-time, the supply of chemicals or costs associated with working in a laboratory should all be considered as costs above and beyond those of a theoretical study within the scope of an office environment, and the proposal would then be judged to be laboratory based. The applicant, however, must clearly present information within the proposal that makes such a judgement possible. In ambiguous cases the evaluator should judge the proposal not to be laboratory based.


5.5. Web Site References

In the Guide for Proposers, applicants were advised that "evaluators WILL NOT systematically visit referenced web pages during the evaluation, therefore, any relevant information MUST be presented in the proposal and not on associated web sites". Consequently, evaluators are not expected to visit any web site referenced in a proposal; the necessary information must be present in the body of the proposal if it is to be taken into account. Evaluators may, of course, refer to web sites in order to verify the legitimacy of claims within the proposal, but they should not base their assessments on additional technical information found on a web site and not contained within the proposal. Evaluators should not take into account any information placed on a web site, even one relating directly to the proposal, if it is apparent that the information was made available after the deadline for the submission of proposals.

5.6. Monitoring Statistics

Following the initial evaluation stage, when all initial scores have been submitted, monitoring statistics will be produced. These statistics will typically include:

• the mean score for each proposal (total and block scores) for each evaluator and for the panel in general;
• the distribution of total and block scores for the individual evaluator and for the panel in general;
• the standard deviation of the scores given by each evaluator and by the panel in general;
• an indication, by proposal, of the relative position of each proposal evaluated by each evaluator;
• an indication of whether an individual typically scores higher or lower than other evaluators evaluating the same proposal.

These monitoring statistics are a useful guide in informing evaluators whether their evaluations are consistent with those of other evaluators. It is highly recommended that experts analyse the statistics carefully before engaging in contact with other experts, in order to take appropriate account of their own marking profile and that of the other experts before revising any score. Evaluators are advised to review their monitoring statistics prior to revising the scores given for a proposal.
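As an indication of the kind of figures listed above, the following minimal Python sketch derives per-evaluator means, standard deviations and an over/under-scoring tendency from a hypothetical score table. The layout and values are invented for illustration; the actual statistics are produced by the Commission's evaluation tools and may differ in detail.

# Illustrative sketch: simple monitoring statistics from hypothetical total scores.
from statistics import mean, pstdev

# scores[evaluator][proposal] = total score out of 5 (example values only)
scores = {
    "evaluator_A": {"P1": 4.2, "P2": 3.1, "P3": 4.8},
    "evaluator_B": {"P1": 3.9, "P2": 2.8, "P3": 4.5},
    "evaluator_C": {"P1": 4.6, "P2": 3.5, "P3": 4.9},
}

# Mean and standard deviation of the scores given by each evaluator.
for evaluator, given in scores.items():
    values = list(given.values())
    print(f"{evaluator}: mean={mean(values):.2f}, std dev={pstdev(values):.2f}")

# Tendency to score higher or lower than the other evaluators on the same proposals,
# measured as the average difference from the per-proposal panel mean.
proposal_ids = {p for given in scores.values() for p in given}
panel_mean = {p: mean(given[p] for given in scores.values()) for p in proposal_ids}
for evaluator, given in scores.items():
    offset = mean(given[p] - panel_mean[p] for p in given)
    print(f"{evaluator} scores {offset:+.2f} on average relative to the panel mean")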


Annexes

Annex I: Individual Assessment Report (IAR) (example)
A model report form is shown here for one HRM action only. Similar evaluation forms are available for each of the HRM actions, for use when evaluating the specific HRM action.

Annex II: Consensus report (CR)

Annex III: Ethical issues form EIR

Annex IV: Indicative panel report format

Annex V: Specific evaluation issues relating to each HRM action
This annex includes the ordering used to produce the initial priority order list, and any deviations from the general text in the main document.


Annex I: Individual Assessment Report (IAR) (example)


FORM IAR - Individual Assessment: EIF                Call: FP6-2002-Mobility-5

Individual Assessment Form for a Marie Curie Intra-European Fellowship

Proposal No.:                Acronym:                Panel:

I. Evaluation summary

Please carry out the detailed evaluation on the following pages and then enter your marks here. Scores for the evaluation criteria should reflect the quality of the proposal as submitted by the proposers.

Criterion (Mark / Weight / Score):
1. Scientific Quality of the Project - weight 15%
2. Quality of Research Training - weight 15%
3. Quality of the Host - weight 15%
4. Quality of the Researcher - weight 15%
5. Management and Feasibility - weight 5%
6. Relevance to the Objectives of the Scheme/Activity - weight 25%
7. Added Value to Community - weight 10%
Total score expressed out of 5 (threshold 3.5)
Total score expressed out of 100 (threshold 70%)

II. Recommendation
(Note: a proposal failing any threshold can NOT be recommended for consideration)

Proposal recommended for consideration:   YES [ ]  NO [ ]
Overall comments (highlighting strengths and weaknesses):

Does this proposal have ethical issues that need further attention?   Yes [ ]  No [ ]
(If yes, you should fill in Form EIR in preparation for the consensus stage)
Should this proposal be referred to the Ethical Review Panel?   Yes [ ]  No [ ]

III. Declaration

I declare that my evaluation of this proposal creates no conflict of interest.

Name:                Signature:                Date:

Criterion 1. SCIENTIFIC QUALITY OF THE PROJECT

Issues to be addressed when assigning an overall mark for this criterion: scientific/technological quality of the project; whether the scientific content of the project is important and relevant; originality/innovative aspects; assessment of the research method; assessment of the interdisciplinary and multidisciplinary aspects of the proposal; whether the proposal describes the state of the art for the scientific area and the relevance of the project.

Overall comments for this criterion:

Overall mark (out of 5):

Note: no threshold.

Criterion 2. QUALITY OF THE RESEARCH TRAINING ACTIVITIES

Issues to be addressed when assigning an overall mark for this criterion: clarity and quality of the research training objectives for the researchers; complementary training and skills offered.

Overall comments for this criterion:

Overall mark (out of 5):

Note: the threshold is 3.

0 = Fails or missing/incomplete information; 1 = Poor; 2 = Fair; 3 = Good; 4 = Very good; 5 = Excellent. Marks should be given to one decimal place; note that the maximum is 5.0 (not 5.9).


Criterion 3. QUALITY OF THE HOST

Issues to be addressed when assigning an overall mark for this criterion: scientific expertise in the field; quality of the group/supervisors; expertise in training researchers in the field and capacity to provide mentoring/tutoring; international collaborations; quality of infrastructure/facilities.

Overall comments for this criterion:

Overall mark (out of 5):

Note: there is no threshold.

Criterion 4. QUALITY OF THE RESEARCHER

Issues to be addressed when assigning an overall mark for this criterion: research experience; research results; independent thinking and leadership qualities; potential for the development of the researcher; suitability of skills for the project proposed.

Overall comments for this criterion:

Overall mark (out of 5):

Note: the threshold is 4.

0=Fails or missing/incomplete information; 1=Poor; 2=Fair; 3=Good; 4=Very good; 5=Excellent. Marks should be given to one decimal place, note the maximum is 5 (not 5.9)


Criterion 5. MANAGEMENT AND FEASIBILITY

Issues to be addressed when assigning an overall mark for this criterion:
• Practical arrangements for the implementation and management of the fellowship.
• Feasibility and credibility of the project.
• Methodological approach to the project and work plan.

Overall comments for this criterion:

Overall mark (out of 5):
Note: the threshold is 3

Criterion 6. RELEVANCE TO THE OBJECTIVES OF THE ACTIVITY

Issues to be addressed when assigning an overall mark for this criterion:
• Benefit to the researcher from the period of advanced training/mobility.
• Match between the project and the researcher's profile.
• Likelihood that the researcher will pursue the line of research after the end of the fellowship.
• Capacity of the fellowship to enhance EU scientific excellence (where appropriate).

Overall comments for this criterion:

Overall mark (out of 5):
Note: there is no threshold


Criterion 7. ADDED VALUE TO THE COMMUNITY

Issues to be addressed when assigning an overall mark for this criterion:
• Extent to which the proposed fellowship contributes towards the objectives of the European Research Area.
• Benefit of mobility through the transfer of knowledge and improved collaborations through the mobile researcher.
• Contribution to research excellence and European competitiveness.

Overall comments for this criterion:

Overall mark (out of 5):
Note: there is no threshold

HORIZONTAL ISSUES TO BE ADDRESSED (but not marked):

Does the researcher have, in your view, more than 4 years full-time equivalent research experience?    Yes o  No o
Does the researcher have, in your view, more than 10 years full-time equivalent research experience?    Yes o  No o
Is the research described in the proposal laboratory based?    Yes o  No o


Annex II: Consensus Report (CR)

FORM CR : Consensus Report EIF

Call: FP6-2002-Mobility-5

Consensus Report for a Marie Curie Intra-European Fellowship

Proposal No.:

Acronym:

Panel:

1. Scientific Quality of the Project

2. Quality of the Training Activities

3. Quality of the Host

4. Quality of the Researcher

5. Management and Feasibility

6. Relevance to the objectives of the activity

7. Added Value to the Community

Overall remarks (highlighting strengths and weaknesses):

Does this proposal have ethical issues that need further attention? (If yes, fill in Form EIR)    Yes o  No o
Do you recommend this proposal to be reviewed by the Ethical Review Panel?    Yes o  No o
Has adequate ethical approval from the country in which this research will take place been supplied in the proposal?    Yes o  No o
Is the research described in the proposal laboratory based?    Yes o  No o
Do you make any recommendations to be taken into account at negotiation?    Yes o  No o
Have you suggested indicators to be used to monitor the implementation of the proposal, if funded?    Yes o  No o

Has the proposal passed all evaluation thresholds?    YES o    NO o

Criterion                                   1     2     3     4     5     6     7    Total
Threshold                                   -     3     -     4     3     -     -
Initial averages (non-weighted, out of 5)
Consensus marks (non-weighted, out of 5)
Weighting %                                15    15    15    15     5    25    10
Weighted score
Total score expressed out of 100

Evaluator names                                        Evaluator signatures

Commission Representative's signature:
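The "Initial averages" row above is obtained from the marks given in the Individual Assessment Reports, before the panel agrees the consensus marks. The sketch below is illustrative only (the list-of-lists layout is an assumption, not an official data format); the weighted total is then computed from the consensus marks exactly as in the sketch shown after the IAR summary table.

    # Illustrative sketch only: averaging the individual evaluators' marks to
    # fill the "Initial averages" row of the consensus table.

    def initial_averages(individual_marks):
        """individual_marks: one list of 7 marks (out of 5) per evaluator."""
        n = len(individual_marks)
        return [round(sum(m[i] for m in individual_marks) / n, 1) for i in range(7)]

    # Hypothetical example with two evaluators:
    print(initial_averages([[4.0, 3.5, 4.0, 4.5, 3.0, 4.0, 3.5],
                            [4.2, 3.7, 4.4, 4.3, 3.2, 4.2, 3.9]]))
    # -> [4.1, 3.6, 4.2, 4.4, 3.1, 4.1, 3.7]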

Annex III: Ethical issues form (EIR)

EIR Ethical issues report form

Proposal number:
Proposal acronym:
Marie Curie Action:

Ethical issues: You are requested to confirm whether the proposed research involves (YES / NO / UNCERTAIN):

• Research activity aiming at human cloning for reproductive purposes
• Research activity intended to modify the genetic heritage of human beings which could make such changes heritable (1)
• Research activities intended to create human embryos solely for the purpose of research or for the purpose of stem cell procurement, including by means of somatic cell nuclear transfer
• Research activities involving the use of human embryos or human embryonic stem cells
• In the case of use of human embryonic stem cells (human ES cells), are these banked or isolated human ES cells in culture?

Please indicate whether the proposal involves (YES / NO / UNCERTAIN):

• Human beings
• Persons not able to give consent
• Children
• Adult healthy volunteers
• Human biological samples
• Human embryonic stem cells in culture
• Human foetal tissue/cells
• Personal data or genetic information
• Animals (any species)
• Non-human primates
• Transgenic small laboratory animals
• Transgenic farm animals
• Cloning of farm animals
• Research involving developing countries (e.g. clinical trials, use of genetic resources…)

Is the research methodology justified, or are alternative methods of comparable effectiveness which raise fewer ethical concerns available? (Please comment)    yes o  no o
…………………………………………………………………………………………………………………

(1) Research relating to cancer treatment of the gonads can be financed.


Have the proposers indicated relevant national and international legislation? (Please comment)    yes o  no o
…………………………………………………………………………………………………………………

Should this proposal be submitted to an ethical review? (Proposals involving the use of human embryonic stem cells in culture, human foetal tissue, non-human primates, transgenic farm animals or cloning of farm animals will systematically go to an ethical review.) If yes, please indicate the main ethical issues.
…………………………………………………………………………………………………………………

Ethical review:    yes o  no o

Evaluator name:                 Date:                 Signature:

Moderators (name and signature):

………………………………………… …….……………………………………

Rapporteur (name and signature)

………………………………………… …….……………………………………


Annex IV: Indicative panel report format

HRM Action and Panel Name
Report of evaluation panel

1. INTRODUCTION AND METHODOLOGY
This panel report covers the following HRM Action:
The table in annex gives an overview of proposals dealt with by the panel.

Total proposals in panel | Ineligible | Failed threshold(s) | Proposals above threshold

2. ANALYSIS OF RECEIVED PROPOSALS
Important problems encountered which are relevant to the evaluation, proposal quality, etc.

3. PROPOSAL PRIORITY
The Panel recommends that a decision of the Commission on funding of proposals be based on the priority as given in the following table(s) (in annex). (Table column headings are indicative and will be adapted to the specific requirements of the HRM action.)

Priority | Proposal Number | Proposal Acronym | Overall score | Fellow-months requested | Country of Host | Recommendations from the evaluators (Y/N) | Ethical Issues Identified (Y/N)
1
2
3

Comments relating to specific issues raised during the evaluation and any considerations of the panel leading to the priority list given above:
• Pay particular attention to the reasons for the choice of priority given here to proposals which have tied scores.
• For proposals involving organisations from "other countries", comment on the significance of their participation to the project.
• Highlight any issues of SME participation, if relevant.

4. ETHICAL ISSUES
Identify proposals requiring special attention due to either the importance of ethical issues raised or the inadequacy of the way ethical issues are addressed.

5. BELOW THRESHOLD/INELIGIBLE PROPOSALS
The table in annex provides the list of proposals which have not been prioritised due to exclusion, the score of at least one of the criteria falling below threshold, or the proposal falling below the overall threshold.

Proposal no. | Acronym | Failing

6. ANNEX
Attach proposal Evaluation Summary Reports.

Annex V: Specific evaluation issues relating to each HRM action

Research Training Networks - Proposals received in response to the call will be split into seven panels depending on the evaluation panel declared by the applicant in the proposal (and on the scientific discipline of the evaluators associated with the panel). The panels are Physics; Chemistry; Life Sciences; Environment and Geosciences; Social, Economic and Human Sciences; IT and Mathematics; and Engineering. The provisional ordering of the panel list will be based on the total score, and for proposals with equal total score, the block marks for the following criteria (in descending order): Quality of the Training/ToK activities of the proposal; Scientific Quality of the project; Management and Feasibility; Quality/Capacity of the Network Partnership; Relevance to the objectives of the activity; Added Value to the Community. Evaluation criteria will be marked in the range 0 to 5 in increments of one decimal place. (An illustrative sketch of this tie-breaking rule follows below.)
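The provisional ordering rule above (total score first, then the block marks of the listed criteria as successive tie-breakers) can be pictured as a multi-level sort key. The sketch below is illustrative only; the field names and data layout are assumptions, not the Commission's ranking software.

    # Illustrative sketch of the tie-breaking rule described above: order by total
    # score, then by the block marks of the listed criteria in priority order.
    # Field names are hypothetical.

    TIE_BREAK_CRITERIA = [           # Research Training Networks priority order
        "training_tok_activities",
        "scientific_quality",
        "management_feasibility",
        "network_partnership",
        "relevance_to_activity",
        "community_added_value",
    ]

    def priority_order(proposals):
        """proposals: list of dicts with 'total_score' and one key per criterion."""
        def key(p):
            return (p["total_score"], *(p[c] for c in TIE_BREAK_CRITERIA))
        return sorted(proposals, key=key, reverse=True)   # highest priority first

The same pattern applies to the other actions listed in this annex; only the list of tie-breaking criteria changes.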

Host Fellowships for Early Stage Training - Proposals received in response to the call will be split into six panels depending on the main discipline declared by the applicant in the proposal. The panels are Physics; Chemistry; Life Sciences; Environment and Geosciences; Social, Economic and Human Sciences; and IT/Mathematics and Engineering. The provisional ordering of the panel list will be based on the total score, and for proposals with equal total score, the block marks for the following criteria (in descending order): Training activity; Content of the proposal; Management and feasibility; Quality of the host; Relevance to the objectives of the scheme; Added value to the community. Evaluation criteria will be marked in the range 0 to 5 in increments of one decimal place.

Host fellowships for the Transfer of Knowledge, Development Scheme - Proposals will be evaluated by a single interdisciplinary panel.

Industry-Academia Partnership Scheme - Proposals will be evaluated by a single interdisciplinary panel.

For both schemes, the provisional ordering of the panel list will be based on the total score, and for proposals with equal total score, the block marks for the following criteria (in descending order): Training activity/transfer of knowledge; Content of the proposal; Management and feasibility; Relevance to the objectives of the scheme; Added value to the community; Quality of the host. Evaluation criteria will be marked in the range 0 to 5 in increments of one decimal place.

Conferences and Training Courses - Proposals will be evaluated by a single interdisciplinary panel which will evaluate both Large Scale Conferences and Series of Events proposals at the same time. Large Scale Conferences and Series of Events will be listed in two separate lists. The provisional ordering of the panel list will be based on the total score, and for proposals with equal total score, the block marks for the following criteria (in descending order): Training activity/transfer of knowledge; Content of the proposal; Management and feasibility; Relevance to the objectives of the scheme; Added value to the community; Quality of the Host. Evaluation criteria will be marked in the range 0 to 5 in increments of half marks.

Intra-European Fellowships - Proposals received in response to the call will be split into six panels depending on the evaluation panel declared by the applicant in the proposal (and on the scientific discipline of the evaluators associated with the panel). The panels are Physics; Chemistry; Life Sciences; Environment and Geosciences; Social, Economic and Human Sciences; and IT/Mathematics and Engineering. The provisional ordering of the panel list will be based on the total score, and for proposals with equal total score, the block marks for the following criteria (in descending order): Quality of the Researcher; Training activity/transfer of knowledge; Relevance to the objectives of the scheme; Quality of the Host; Content of the proposal; Added value to the community; Management and feasibility. Evaluation criteria will be marked in the range 0 to 5 in increments of one decimal place.

International Incoming Fellowships and International Outgoing Fellowships – Proposals will be evaluated by a single interdisciplinary panel for each call. One ordered list will be produced for each of the two schemes. The provisional ordering of the panel list will be based on the total score, and for proposals with equal total score, the block marks for the following criteria (in descending order): Quality of the Researcher; Training activity/transfer of knowledge; Added value to the community; Quality of the Host; Relevance to the objectives of the scheme; Content of the proposal; Management and feasibility. Evaluation criteria will be marked in the range 0 to 5 in increments of one decimal place.

Excellent Teams (Grants) - Proposals will be evaluated by a single interdisciplinary panel. The provisional ordering of the panel list will be based on the total score, and for proposals with equal total score, the block marks for the following criteria (in descending order): Quality of the researcher; Content of the proposal; Relevance to the objectives of the scheme; Added value to the community; Management and feasibility; Quality of the Host. Evaluation criteria will be marked in the range 0 to 5 in increments of half marks.

Excellence Awards - Proposals will be evaluated by a single interdisciplinary panel of experts for the criterion Quality of the Researcher. A Grand Jury will subsequently evaluate the criteria Relevance to the Objectives of the Scheme and Added Value to the Community. Evaluation criteria will be marked in the range 0 to 5 in increments of half marks.


Chairs - Proposals will be evaluated by a single interdisciplinary panel. The provisional ordering of the panel list will be based on the total score, and for proposals with equal total score, the block marks for the following criteria (in descending order): Quality of the researcher; Training activity/transfer of knowledge; Added value to the community; Content of the proposal; Relevance to the objectives of the scheme; Quality of the Host; Management and feasibility. Evaluation criteria will be marked in the range 0 to 5 in increments of half marks.

European Return and Re-integration Fellowships - Proposals will be evaluated on a continuous basis by a pool of multidisciplinary experts. Proposals will be reviewed in a multidisciplinary panel four times per calendar year and will be batched into four selections per year. The provisional ordering of the selection list will be based on the total score, and for proposals with equal total score, the block marks for the following criteria (in descending order): Relevance to the objectives of the scheme; Added value to the community; Quality of the researcher; Quality of the Host; Content of the proposal; Management and feasibility; Transfer of Knowledge. Note that researchers applying for a European Return and Re-integration fellowship will be awarded the maximum mark for Added value to the community if returning to the country of nationality. Evaluation criteria will be marked in the range 0 to 5 in increments of one decimal place.

International Return and Re-integration Fellowships - Proposals will be evaluated on a continuous basis by a pool of multidisciplinary experts. Proposals will be reviewed in a multidisciplinary panel four times per calendar year and will be batched into four selections per year. The provisional ordering of the selection list will be based on the total score, and for proposals with equal total score, the block marks for the following criteria (in descending order): Quality of the researcher; Relevance to the objectives of the scheme; Quality of the Host; Content of the proposal; Added value to the community; Management and feasibility; Training activity/transfer of knowledge. Evaluation criteria will be marked in the range 0 to 5 in increments of one decimal place.
