For R packages, mpn.scorecard handles both scoring and rendering. The workflow is to score the package with score_pkg(), optionally generate a traceability matrix with make_traceability_matrix(), and then render the scorecard with render_scorecard().

render_scorecard() also supports rendering packages that are scored outside of mpn.scorecard. The scorer is responsible for preparing a results directory with the set of files described below.

Details

Input files

The following input files define results for the scorecard to render. These must reside in a directory named <package>_<version>, following the naming of the output directory returned by score_pkg().

  • <package>_<version>.pkg.json: This file provides general information about the package being scored. It requires the following keys:

    • mpn_scorecard_format: The version of the format in which these input files are specified. This should be "1.0".

    • pkg_name, pkg_version: The name and version of the package.

    • scorecard_type: The type of package. Two types are currently recognized and receive special handling: "R" and "cli". Everything else falls back to default handling.

      If you're specifying "R" here, you should probably use score_pkg() instead.

    Example:

    {
      "mpn_scorecard_format": "1.0",
      "pkg_name": "foo",
      "pkg_version": "1.2.3",
      "scorecard_type": "cli"
    }
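    Because external scorers can be written in any language, this file can be produced with any JSON serializer. A minimal Python sketch, using a hypothetical package named "foo" at version "1.2.3":

    ```python
    import json
    from pathlib import Path

    pkg_name, pkg_version = "foo", "1.2.3"  # hypothetical package

    # All input files must live in a directory named <package>_<version>.
    results_dir = Path(f"{pkg_name}_{pkg_version}")
    results_dir.mkdir(exist_ok=True)

    pkg_info = {
        "mpn_scorecard_format": "1.0",
        "pkg_name": pkg_name,
        "pkg_version": pkg_version,
        "scorecard_type": "cli",
    }
    # The file name must follow the <package>_<version>.pkg.json convention.
    (results_dir / f"{pkg_name}_{pkg_version}.pkg.json").write_text(
        json.dumps(pkg_info, indent=2) + "\n"
    )
    ```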

  • <package>_<version>.check.txt: Output from the package check. This is included in the appendix verbatim.

  • <package>_<version>.coverage.json: Code coverage percentages. The values will be rounded to two decimal places when rendering. This file is optional.

    Example:

    {
      "overall": 91.54265,
      "files": [
        {
          "file": "cmd/foo.go",
          "coverage": 98.7643
        },
        {
          "file": "cmd/bar.go",
          "coverage": 84.321
        }
      ]
    }
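    A scorer would typically take these numbers from its coverage tool's report. A Python sketch with hypothetical per-file values (full precision can be kept, since rounding to two decimal places happens at render time):

    ```python
    import json
    from pathlib import Path

    results_dir = Path("foo_1.2.3")  # hypothetical <package>_<version> directory
    results_dir.mkdir(exist_ok=True)

    # Hypothetical percentages taken from a coverage tool's report.
    coverage = {
        "overall": 91.54265,
        "files": [
            {"file": "cmd/foo.go", "coverage": 98.7643},
            {"file": "cmd/bar.go", "coverage": 84.321},
        ],
    }
    (results_dir / "foo_1.2.3.coverage.json").write_text(
        json.dumps(coverage, indent=2) + "\n"
    )
    ```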

  • <package>_<version>.scores.json: Scores for individual metrics grouped into four categories: "testing", "documentation", "maintenance", and "transparency". Each category must have at least one score.

    For the testing category, "check" is required. Its value should be 1 if the package check passed and 0 if it failed. "coverage" is required if the <package>_<version>.coverage.json file exists; its value should match the "overall" value from <package>_<version>.coverage.json, divided by 100.

    Example:

    {
      "testing": {
        "check": 1,
        "coverage": 0.9154265
      },
      "documentation": {
        "has_website": 1,
        "has_news": 1
      },
      "maintenance": {
        "has_maintainer": 1,
        "news_current": 1
      },
      "transparency": {
        "has_source_control": 1,
        "has_bug_reports_url": 1
      }
    }
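    The "coverage" score can be derived directly from the coverage file's "overall" value. A Python sketch (the metric names outside of "check" and "coverage" are the example metrics from above, not a fixed set):

    ```python
    import json
    from pathlib import Path

    results_dir = Path("foo_1.2.3")  # hypothetical <package>_<version> directory
    results_dir.mkdir(exist_ok=True)

    overall = 91.54265  # "overall" from <package>_<version>.coverage.json
    scores = {
        "testing": {
            "check": 1,                 # 1 = package check passed, 0 = failed
            "coverage": overall / 100,  # must match coverage.json "overall" / 100
        },
        "documentation": {"has_website": 1, "has_news": 1},
        "maintenance": {"has_maintainer": 1, "news_current": 1},
        "transparency": {"has_source_control": 1, "has_bug_reports_url": 1},
    }
    (results_dir / "foo_1.2.3.scores.json").write_text(
        json.dumps(scores, indent=2) + "\n"
    )
    ```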

  • <package>_<version>.metadata.json: Information to include in the "System Info" table. The table will include the "date" and "executor" values, as well as any key-value pairs defined under "info.env_vars" and "info.sys". The "date" and "executor" keys are required.

    Example:

    {
      "date": "2024-08-01 08:19:12",
      "executor": "Bobert",
      "info": {
        "env_vars": {
          "METWORX_VERSION": "22.09"
        },
        "sys": {
          "sysname": "Linux",
          "machine": "x86_64"
        }
      }
    }
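    The required keys can be filled in from the scorer's environment. A Python sketch using the standard library (the choice of environment variables under "info.env_vars" is up to the scorer; METWORX_VERSION is just the example from above):

    ```python
    import json
    import os
    import platform
    from datetime import datetime
    from pathlib import Path

    results_dir = Path("foo_1.2.3")  # hypothetical <package>_<version> directory
    results_dir.mkdir(exist_ok=True)

    metadata = {
        # "date" and "executor" are required; "info" entries are free-form.
        "date": datetime.now().strftime("%Y-%m-%d %H:%M:%S"),
        "executor": os.environ.get("USER", "unknown"),
        "info": {
            "env_vars": {"METWORX_VERSION": os.environ.get("METWORX_VERSION", "")},
            "sys": {"sysname": platform.system(), "machine": platform.machine()},
        },
    }
    (results_dir / "foo_1.2.3.metadata.json").write_text(
        json.dumps(metadata, indent=2) + "\n"
    )
    ```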

  • <package>_<version>.matrix.yaml: A file defining entries to render as the traceability matrix table. The traceability matrix table is meant to map all user-facing entry points (e.g., exported functions or available commands for a command-line executable) to the relevant documentation and test files.

    The file should consist of a sequence of entries with the following items:

    • entrypoint: The name of the entry point.

    • code: The path to where the entry point is defined.

    • doc: The path to the entry point's main documentation.

    • tests: A list of paths where the entry point is tested.

    What the entry point is called in the table depends on scorecard_type. For "cli", the column name is "Command" and, for "R", it is "Exported Function". For all other types, it is "Entry Point".

    This file is optional if the add_traceability argument of render_scorecard() is "auto" or FALSE.

    Example:

    - entrypoint: foo
      skip: true
    
    - entrypoint: foo bar
      code: cmd/bar.go
      doc: docs/commands/foo_bar.md
      tests:
        - cmd/bar_test.go
        - integration/bar_test.go
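    For a scorer that tracks its entry points programmatically, the file can be emitted with plain string formatting, avoiding a YAML dependency. A Python sketch with a hypothetical entry list (only the fully specified entry shape from the example is covered here):

    ```python
    from pathlib import Path

    results_dir = Path("foo_1.2.3")  # hypothetical <package>_<version> directory
    results_dir.mkdir(exist_ok=True)

    # Hypothetical entry points for a CLI scorer.
    entries = [
        {
            "entrypoint": "foo bar",
            "code": "cmd/bar.go",
            "doc": "docs/commands/foo_bar.md",
            "tests": ["cmd/bar_test.go", "integration/bar_test.go"],
        },
    ]

    lines = []
    for entry in entries:
        lines.append(f"- entrypoint: {entry['entrypoint']}")
        lines.append(f"  code: {entry['code']}")
        lines.append(f"  doc: {entry['doc']}")
        lines.append("  tests:")
        lines.extend(f"    - {path}" for path in entry["tests"])
    matrix = "\n".join(lines) + "\n"

    (results_dir / "foo_1.2.3.matrix.yaml").write_text(matrix)
    ```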