Annual Report on Japan's ODA Evaluation 2021

two were not set as verification items, or two to three verification items were condensed into one. For "Effectiveness of Results," the verification method varied more by evaluator than for the other evaluation criteria. Furthermore, when it came to verification beyond the level of outcomes, there were cases in which the objectives stated in assistance policy documents were not described clearly, and none of the evaluations was found to have verified the achievement of the results of Japan's assistance against the "Objectives." For "Appropriateness of Processes," the number of verification items set varied greatly by evaluation, from two to nine. The total number of items set as verification items was also large, at 22. In some evaluation reports, the actual verification items did not match the verification items indicated in the "Evaluation Framework" established at the start of the evaluation with the agreement of stakeholders.

(3) Presentation Format of Evaluation Results Consistent with the Objectives of Country Assistance Evaluations
The desired presentation format of evaluation results varies depending on the scope of evaluation. Thus, for Country Assistance Evaluations, the presentation format of the evaluation results (including the presence or absence of ratings) should be determined after clarifying the scope of evaluation. Although the use of alphabetical or numeric ratings makes the results easier to understand, it could also prevent the subjects of evaluation from perceiving the results in a constructive manner. Accordingly, the presentation format of the evaluation should be determined with consideration for both of these factors.

(5) Timing of Evaluation Implementation and Reflection in ODA Policies
Of the nine Country Assistance Evaluations whose results have been effectively utilized in assistance policies, four had a Country Development Cooperation Policy (formerly Country Assistance Policy) formulated within two years of implementation of the Country Assistance Evaluation. Closer timing of the two would be expected to make it easier to reflect the results of Country Assistance Evaluations in assistance policies.

Recommendations Based on Study Results

(1) Recommendations for Further Utilization of Country Assistance Evaluations in ODA Policymaking

(a) Extraction of Useful Lessons for ODA Policymaking
Information applicable to other countries or regions should be organized and indicated in the evaluation reports as "Lessons Learned." Accordingly, it would be desirable for the ODA Evaluation Guidelines to explain the sorts of items that should be included as lessons, just as they do for "Recommendations."

(b) Country Assistance Evaluation Implementation Planning (Timing of Implementation of Evaluations and Selection of Targets)
Making the scope of implementation of Country Assistance Evaluations more explicit, by instituting a policy of incorporating Country Assistance Evaluations into part of the revision process for Country Development Cooperation Policies, would be desirable for both evaluators and evaluation users from the perspective of facilitating shared understanding. The timing of the Country Assistance Evaluation and the revision of the Country Development Cooperation Policy should also be considered when selecting target countries, from the viewpoint of enhancing the PDCA cycle.

(2) Recommendations for More Effective Implementation of Country Assistance Evaluations

(a) Reconsideration of Sub-Ratings
The criteria for sub-ratings need to be defined in order to make the evaluations more convincing to readers, with clearer evidence.

(b) Clarification of Verification Items
In parallel with (a) above, the items to be verified in Country Assistance Evaluations should be organized, and standard verification items should be proposed in the Guidelines. Clarifying the verification items and indicating them in the Guidelines would make it possible to prevent variances in verification items according to the evaluator.

(c) Clarification of Objectives and Intervening Logic When Planning Evaluations
With regard to "Effectiveness of Results," a "road map for achievement of results" should be organized for the purpose of objective verification, covering the respective changes aimed at by the Basic Policy of Assistance (primary objective) and the Priority Areas (secondary objectives) indicated in the Country Development Cooperation Policy, as well as the Development Issues (tertiary objectives) indicated in the Rolling Plan, how the objectives can be accomplished, and the external factors that influence them. Undergoing this process would facilitate the identification of survey items and also enable judgment of the results of information gathering and goal achievement, as well as the specification of contributing and constraining factors, thereby clarifying the verification of effectiveness. In addition, proposals for realignment of the objectives could be expected ahead of the revision of the Development Cooperation Policy.

(d) Management of Evaluation Surveys by the ODA Evaluation Division of MOFA
When the ODA Evaluation Division, which manages evaluations, checks Country Assistance Evaluation Reports, it should make sure that the actual verification items match the verification items indicated in the evaluation framework. If the verification items were changed from the evaluation framework, the reasons for the changes should be confirmed with the evaluation team and explained in the report.