Compare revisions

Changes are shown as if the source revision was being merged into the target revision.
Commits on Source (26)
  • Bump to 1.0.1 (dragoon/komodo!173) · 020b9213
    DETCHART Jonathan authored
  • ignore some IDE files (dragoon/komodo!174) · 13af3b28
    STEVAN Antoine authored and DETCHART Jonathan committed
  • DETCHART Jonathan · 36714bef
  • add badges for the latest release and the CIs (dragoon/komodo!178) · 385270c0
    STEVAN Antoine authored
    this adds three badges to the README:
    - the latest release (links to the [release page](https://gitlab.isae-supaero.fr/dragoon/komodo/-/releases))
    - the GitLab CI status (links to the [GitLab pipeline dashboard](https://gitlab.isae-supaero.fr/dragoon/komodo/-/pipelines))
    - the GitHub CI status (links to the [GitHub workflow runs dashboard](https://github.com/dragoon-rs/komodo/actions))
  • add badges for crates.io and docs.rs (dragoon/komodo!179) · f711147d
    STEVAN Antoine authored
    this MR adds two new badges to the README:
    - the crate (links to the [Komodo crate](https://crates.io/crates/komodo))
    - the documentation (links to the [doc](https://docs.rs/komodo/latest/komodo/))
  • add FRI (dragoon/komodo!175) · 202d8bcc
    STEVAN Antoine authored
    FRI protocol from [`dragoon/fri`](https://gitlab.isae-supaero.fr/dragoon/fri)
    
    ## changelog
    - add binary assets to be used as inputs
    - add `fri` and `fri_test_utils` as local dependencies until [`dragoon/fri`](https://gitlab.isae-supaero.fr/dragoon/fri) becomes public
    - add `fri` feature and module (see section below for the public definitions)
    - fix bug in 32bd6566
    - bump Rust in e7a2c244
    - add a versatile example
    - add Nushell pipeline to run benchmarks and plot results
    - add some tests
    
    ## `fri` module API
    ```rust
    struct Block<F: PrimeField, H: Hasher>
    ```
    
    ```rust
    fn evaluate<F: PrimeField>(bytes: &[u8], k: usize, n: usize) -> Vec<Vec<F>>
    ```
    
    ```rust
    fn encode<F: PrimeField>(
        bytes: &[u8],
        evaluations: Vec<Vec<F>>,
        k: usize,
    ) -> Vec<fec::Shard<F>>
    ```
    
    ```rust
    fn prove<const N: usize, F: PrimeField, H: Hasher, P>(
        evaluations: Vec<Vec<F>>,
        shards: Vec<fec::Shard<F>>,
        blowup_factor: usize,
        remainder_plus_one: usize,
        nb_queries: usize,
    ) -> Result<Vec<Block<F, H>>, KomodoError>
    where
        P: DenseUVPolynomial<F>,
        for<'a, 'b> &'a P: Div<&'b P, Output = P>,
        <H as rs_merkle::Hasher>::Hash: AsRef<[u8]>,
    ```
    
    ```rust
    fn verify<const N: usize, F: PrimeField, H: Hasher, P>(
        block: Block<F, H>,
        domain_size: usize,
        nb_queries: usize,
    ) -> Result<(), KomodoError>
    where
        P: DenseUVPolynomial<F>,
        for<'a, 'b> &'a P: Div<&'b P, Output = P>,
        <H as rs_merkle::Hasher>::Hash: AsRef<[u8]>,
    ```
    
    ```rust
    fn decode<F: PrimeField, H: Hasher>(blocks: Vec<Block<F, H>>, n: usize) -> Vec<u8>
    ```
    
    ## results
    
    ### times
    
    ![evaluating](/uploads/69607a2f987e26c23dd172d469c682c5/evaluating.png)
    ![encoding](/uploads/540ac15c21ba7500ad34b068c5d9d7dc/encoding.png)
    ![proving](/uploads/a694525c7d1277fe0b53dd87b6443900/proving.png)
    ![verifying_single](/uploads/8f03a3a0abca329eea396f3ba8b76512/verifying_single.png)
    ![decoding](/uploads/ba2cb0aa54f2ecff16340333121f16ca/decoding.png)
    
    ### sizes
    
    ![commits_single](/uploads/59a96661482fb1d918efc098e060cd45/commits_single.png)
    ![commits_single_normalized](/uploads/11398ed3f37ab4917b717cb717c9070d/commits_single_normalized.png)
    ![proofs](/uploads/17da07f4ef4ee637236deba7835cc022/proofs.png)
    ![proofs_normalized](/uploads/b2aae9491c56767ad1bf5674cf980361/proofs_normalized.png)
  • Add fft and interpolation method for RS coding benchmarks (!185) · 702cd5eb
    DETCHART Jonathan authored and STEVAN Antoine committed
    This MR adds an option to perform erasure coding using FFT rather than a matrix. It also adds the FP128 field to the list of curves.

    Note that there is a redundant `random_loss` function in both `benchmarks/bin/fec.rs` and `examples/fec.rs`.
  • check more args in the benchmarks (dragoon/komodo!187) · 57a96c02
    STEVAN Antoine authored
    this replaces the "_nothing to do_" messages from Nushell commands with more explicit errors.
  • update benchmarks readme (dragoon/komodo!176) · 07602875
    STEVAN Antoine authored
    ## changelog
    - add more snippets and instructions
    - add missing imports
    - update the "atomic operations" section
    - add a table of contents
  • check Nushell files in the CI (dragoon/komodo!137) · f2a76fbb
    STEVAN Antoine authored
    in order to help catch issues with the various Nushell scripts and modules in the code base, i propose to add a CI script that will check them all.

    below is an example error, e.g. when introducing a syntax error in `.env.nu`:
    ```
    Error:   × Failed to parse content:
      │     file: .env.nu
      │     err:  Expected keyword.
    ```
  • add rust-analyzer to the toolchain (dragoon/komodo!188) · 11a5b59d
    STEVAN Antoine authored
    this should avoid the following error when trying to run the LSP in a toolchain which does not have the `rust-analyzer` component, without having to run `rustup component add rust-analyzer` manually:
    ```
    error: Unknown binary 'rust-analyzer' in official toolchain 'stable-x86_64-unknown-linux-gnu'.
    ```
  • fix typos and improve algebra/semi_avid tests (!189) · cc412625
    STEVAN Antoine authored
    mainly fixes a few typos and improves `algebra` and `semi_avid` tests
    
    ## changelog
    - fix some typos and notes in the documentation
    - test more cases for `algebra::split_data_into_field_elements` and `algebra::merge_elements_into_bytes`, with more data lengths and more moduli respectively
    - remove a useless `::<Vec<_>>` on a `collect` in `fec`
    - `semi_avid::tests`
      - refactor `bls12-381` into a constant
      - write an "_attack_" function to alter a particular block, `attack<F, G>(block: Block<F, G>, c: usize, base: u128, pow: u64) -> Block<F, G>`
      - pass a list of attacks, i.e. `attacks: Vec<(usize, usize, u128, u64)>`, to `verify_with_errors_template`, use the same as the previous hardcoded one
      - pass a list of "_recodings_", i.e. `recodings: Vec<Vec<usize>>`, to `verify_recoding_template` and assert the number of blocks, e.g. `vec![vec![2, 3], vec![3, 5]]` means that recoding 2 and 3 together should verify, and same with 3 and 5, and also that there must be at least 6 blocks
      - pass a list of "_recodings_", i.e. `recodings: Vec<(Vec<Vec<usize>>, bool)>`, to `end_to_end_with_recoding_template` and assert the number of blocks and the number of source shards, e.g. `vec![(vec![vec![0, 1], vec![2], vec![3]], true)]` means that trying to decode with blocks 2, 3 and a recoded block from 0 and 1 should work, and also that there must be at least 4 blocks and at most 3 source shards
      - pass `k` and `n` to `run_template`
  • refactor CI and Makefile for Nushell installation (!192) · 5511404f
    STEVAN Antoine authored
    the idea is to make the installation of Nushell easier to maintain, especially regarding versions, currently pinned to `0.95.0`
    
    > successful run on GitHub: [13032726635](https://github.com/dragoon-rs/komodo/actions/runs/13032726635)
    
    this is also to allow easier testing locally with the same Nushell version as in the CI, e.g.
    ```bash
    # install in `~/.local/bin/` and have multiple versions
    make install-nu
    
    hash=$(/tmp/nu --no-config-file --commands 'version | get commit_hash')
    nu_bin=$"$HOME/.local/bin/nu-$hash"
    cp /tmp/nu $nu_bin
    
    make NU=$nu_bin show test
    ```
    or
    ```bash
    # install in the repo and overwrite each time
    make NU_DEST=. install-nu
    make NU=./nu show test
    ```
    
    # changelog
    - Makefile
      - split the global `.PHONY` rule into _atomic_ rules next to each _phony_ rule
      - define `NU` and `NU_FLAGS` to allow changing which and how Nushell runs
      - define `NU_ARCH`, `NU_VERSION`, `NU_BUILD` and `NU_DEST` for Nushell installation
      - tweak the output of `make show` a bit
      - add `print-%` rules to print `Makefile` variables, e.g. `make print-NU_FLAGS` would print `--no-config-file`
      - add `make install-nu` to replace the ones from the CIs
    - GitLab CI
      - use `make install-nu`
      - export `PATH` with `make print-NU_DEST`
    - GitHub CI 
      - use `make install-nu` instead of [github.com:hustcer/setup-nu](https://github.com/hustcer/setup-nu)
      - export `PATH` with `make print-NU_DEST` and `GITHUB_ENV`
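    The `print-%` rules mentioned above can be sketched as follows (a minimal standalone sketch; the variable value is taken from the example above, the rest is assumed):

    ```make
    # print the value of any Makefile variable with `make print-<NAME>`
    NU_FLAGS := --no-config-file

    print-%:
    	@echo $($*)
    ```

    with this pattern rule, `make print-NU_FLAGS` prints the value of `NU_FLAGS`, which is how the CI can recover `NU_DEST` to extend its `PATH`.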
  • rework benchmarks: script and output format (dragoon/komodo!193) · ab0d9a8b
    STEVAN Antoine authored
    this is an attempt at making benchmarks easier to work with
    - `benchmarks run` will read benchmarks from NUON data and run them
    - `benchmarks plot` will plot benchmark results
    - the output format will be something like:
      - a directory whose name is the hash of the CPU spec and the Komodo commit hash
      - contains `cpu.json` with the CPU info
      - contains `komodo.txt` with the Komodo commit hash
      - contains NDJSON result files
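    the layout above can be sketched in plain shell (hypothetical file names and contents, for illustration only; the real directory name hashes the actual CPU spec together with the Komodo commit hash):

    ```shell
    # build a results directory keyed by a hash of the CPU spec and the commit
    commit="ab0d9a8b"
    key=$(printf '%s-%s' "example-cpu-spec" "$commit" | sha256sum | cut -d' ' -f1)
    outdir="results/$key"
    mkdir -p "$outdir"

    echo '{"model": "example-cpu"}' > "$outdir/cpu.json"   # the CPU info
    echo "$commit" > "$outdir/komodo.txt"                  # the Komodo commit hash
    echo '{"name": "encoding", "times": [1, 2, 3]}' > "$outdir/encoding.ndjson"

    ls "$outdir"
    ```
    
    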
    
    > 💡 **Note**
    >
    > results will typically be uploaded to https://gitlab.isae-supaero.fr/dragoon/komodo-benchmark-results
    
    > 💡 **Note**
    >
    > this MR goes alongside the [`komodo-benchmark-results@restart`](https://gitlab.isae-supaero.fr/dragoon/komodo-benchmark-results/-/compare/main...restart) branch
    
    ## changelog
    - bump Nushell to 0.101.0
      - parallel `$in` => remove useless `let input = $in` when possible, e.g. still required when using the function's `$in` in a `for` loop
      - `group-by` changed => `group-by x --to-table` will now produce a table with columns `x` and `items` instead of `group` and `items` as in 0.95.0
    - add link to results repo
    - the Nushell benchmarks lib
      - rename `--force` to `--no-confirm (-y)`
      - add `--append` 
      - reject columns that GPLT will complain about, e.g. `$.points.k` for the FRI plots
      - add `--save` to the FRI plot
    - move the "field" and "curve group" benchmarks from `benchmarks/src/bin/operations/` to `benchmarks/src/bin/`
    - remove `benchmarks/params/fri.nu` because it's been uniformized with the other methods
    - rewrite the README
    - add main function to `benchmarks/` that runs the benchmarks from a NUON record specification
    - simplify the output of FRI run
    
    ## TODO
    - [x] fix "_atomic operations_" (done in 4f69a1d6)
    - [x] check that _plotting_ still works
    
    ## images
    
    ### Field
    ![complex_curve_group_operations](/uploads/57b36926cce041cf405a9b44f190b8b8/complex_curve_group_operations.png)
    ![complex_field_operations](/uploads/0747c85dbaff8980561aa9d922fcd5e7/complex_field_operations.png)
    ![simple_curve_group_operations](/uploads/974cf70fed68f8d8ac898d54be3f27be/simple_curve_group_operations.png)
    ![simple_field_operations](/uploads/a3d4d0dcdeb35d4c434eaa38fb51e7b5/simple_field_operations.png)
    
    ### Linear algebra
    ![linalg-inverse](/uploads/bc290ffa39459ce0f9bbd393b50b7b98/linalg-inverse.png)
    ![linalg-mul](/uploads/96d8c2a63ed48d6a0d3508b4a948153b/linalg-mul.png)
    ![linalg-transpose](/uploads/128e35eca91497d8aadb0130c05aeee3/linalg-transpose.png)
    
    ### FEC
    ![encoding](/uploads/405c4d3ef9ec5135ebdd7ce2e6c96bfe/encoding.png)
    ![decoding](/uploads/d793234d44e9fc6f34f0c2a9372863cd/decoding.png)
    ![recoding](/uploads/413021de997c86d45b1287fcfe7804c7/recoding.png)
    ![combined](/uploads/6d1c3ae6d3bf5547434ca29ae80b5536/combined.png)
    ![ratio](/uploads/6cebd7a0bcef57d1b1256bc3941c1b0a/ratio.png)
    
    ### ZK
    ![setup](/uploads/1feb169452aa3274dc924edf772c9a5b/setup.png)
    ![commit](/uploads/0a2775c2116ca7d3fa7b956b934d1565/commit.png)
    
    ### FRI
    ![commits_single](/uploads/4602725e551a025d42815183a61d11b2/commits_single.png)
    ![commits_single_normalized](/uploads/2ffebea940af1cbbe9bd64499343e0e9/commits_single_normalized.png)
    ![end_to_end](/uploads/46917abd2f5976dfa6d3039cc0ab2c0e/end_to_end.png)
    ![evaluating](/uploads/cf4dc496cd5144615bf5f9b06d27dccd/evaluating.png)
    ![proofs](/uploads/b0828bfa62c2226c8d63c9ecd387049f/proofs.png)
    ![proofs_normalized](/uploads/8fbf0d713884e2e147c62313a001f379/proofs_normalized.png)
    ![proving](/uploads/161539dc412330be1878cdda82c3d966/proving.png)
    ![verifying_single](/uploads/2df7e777481d7789478386f8e83e0783/verifying_single.png)
  • fix the Rust toolchain in the CI (!195) · 3c8afac4
    STEVAN Antoine authored
    this is to avoid having to change the toolchain in the CI every time.

    `rust-toolchain.toml` and `.gitlab-ci.yml` will have to be updated in pairs when bumping / changing the Rust toolchain.
  • Added Latex support for doc (!197) · b5381fda
    HEME Clement authored and STEVAN Antoine committed
    Added an HTML header file, a metadata line in Cargo.toml, and some formatting changes in the doc to render LaTeX formulas on docs.rs. It works locally and should work online.
  • add a script to check the GitHub mirror (!199) · b7e2d99d
    STEVAN Antoine authored
    ## usage
    ```bash
    nu scripts/check-mirror.nu ...[
        https://gitlab.isae-supaero.fr/dragoon/komodo
        https://github.com/dragoon-rs/komodo
        main
    ]
    ```
    will
    - check the Nushell version and throw a warning if mismatch
    - add temporary remotes with random names
    - fetch the temporary remotes
    - get the revision of the `$branch` for both temporary remotes
    - clean the temporary remotes
    - show a message if the two revisions are not the same, including by how many commits they differ
    
    > **Note**
    >
    > this does not conflict with #17
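    the divergence check at the heart of the script can be illustrated with plain git (an illustrative sketch on a toy repository, not the actual script, which works on temporary remotes):

    ```shell
    # create a toy repo where `main` is one commit ahead of `mirror`
    tmp=$(mktemp -d) && cd "$tmp"
    git init -q -b main .
    git -c user.name=t -c user.email=t@t commit -q --allow-empty -m base
    git branch mirror                 # the "mirror" stays at the base revision
    git -c user.name=t -c user.email=t@t commit -q --allow-empty -m extra

    # the revisions of the branch on both sides
    git rev-parse main mirror

    # by how many commits the two sides differ: "<ahead>\t<behind>"
    git rev-list --left-right --count main...mirror
    ```
    
    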
  • remove nushell from ci (!200) · 0d7a5cd6
    STEVAN Antoine authored
    this should close #17.

    SACLIN has been moved to [`gitlab.isae-supaero.fr:dragoon/komodo.nu`](https://gitlab.isae-supaero.fr/dragoon/komodo.nu).
  • use gh-cli to pull mirror run information (!201) · 1519af49
    STEVAN Antoine authored
    This is to complete the `check-mirror.nu` script with information about the CI runs.
    
    The current output of
    ```bash
    nu scripts/check-mirror.nu ...[
        https://gitlab.isae-supaero.fr/dragoon/komodo
        https://github.com/dragoon-rs/komodo
        main
    ]
    ```
    is
    ```
    [INF] adding remotes
    [INF] fetching
    [INF] cleaning
    [ OK] mirror is up to date
    [INF] pulling mirror runs
    #──┬─────id─────┬─head_sha─┬──status───┬─conclusion─┬─run_started_at─
    0  │ 11950217125│ de4266c0 │ completed │ success    │ 4 months ago
    1  │ 11950604920│ 202d8bcc │ completed │ success    │ 4 months ago
    2  │ 11950873603│ 702cd5eb │ completed │ success    │ 4 months ago
    3  │ 12028360471│ 57a96c02 │ completed │ success    │ 4 months ago
    4  │ 12028432364│ 07602875 │ completed │ success    │ 4 months ago
    5  │ 12028781541│ f2a76fbb │ completed │ success    │ 4 months ago
    6  │ 12631165433│ 11a5b59d │ completed │ success    │ 2 months ago
    7  │ 13008608156│ cc412625 │ completed │ success    │ 2 months ago
    8  │ 13033206333│ 5511404f │ completed │ success    │ 2 months ago
    9  │ 13113244065│ ab0d9a8b │ completed │ success    │ 2 months ago
    10 │ 13679188387│ 3c8afac4 │ completed │ success    │ 3 weeks ago
    11 │ 14191103891│ b5381fda │ completed │ success    │ a day ago
    12 │ 14214853530│ b7e2d99d │ completed │ success    │ an hour ago
    13 │ 14215431531│ 0d7a5cd6 │ completed │ success    │ 32 minutes ago
    ───┴────────────┴──────────┴───────────┴────────────┴────────────────
    ```
  • use `gitlab.isae-supaero.fr:a.stevan/nob.rs` to build (!202) · 4962757c
    STEVAN Antoine authored
    that's an attempt at using Rust to build itself.
    
    this is using [`gitlab.isae-supaero.fr:a.stevan/nob.rs@e4b03cdd`](a.stevan/nob.rs@e4b03cdd).
    
    > 💡 **Note**
    >
    > to be honest, this is not a 100% replacement of the `Makefile`...
    >
    > `make.rs` does a lot more and provides a full CLI with easy-to-use options, e.g. instead of `make fmt` and `make fmt-check`, we now have `./make.rs fmt` and `./make.rs fmt --check`
    >
    > (see the API below)
    
    ## the API
    ```
    Usage: make [OPTIONS] [COMMAND]
    
    Commands:
      fmt      Formats the code
      check    Checks the code
      clippy   Runs Clippy
      test     Runs the tests
      version  Shows the version of all the tools used,
      doc      Builds the documentation
      help     Print this message or the help of the given subcommand(s)
    
    Options:
      -h, --help      Print help
      -V, --version   Print version
    ```
    ```
    Usage: make fmt [OPTIONS]
    
    Options:
      -c, --check  Only checks instead of really formatting
    ```
    ```
    Usage: make check
    ```
    ```
    Usage: make clippy
    ```
    ```
    Usage: make test [OPTIONS]
    
    Options:
      -v, --verbose   Be extra verbose with the output of the tests
      -e, --examples  Run the examples instead of regular tests
    ```
    ```
    Usage: make version
    ```
    ```
    Usage: make doc [OPTIONS]
    
    Options:
      -o, --open      Open the documentation in the browser
      -p, --private   Document private items
      -f, --features  Document all features
    ```
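    the dispatch behind such a CLI could be sketched with nothing but `std` (a hypothetical sketch: the real `make.rs` uses `clap` and runs commands through `nob.rs`; the cargo invocations are assumed from the repository's Makefile):

    ```rust
    // map a `make.rs` subcommand to the cargo command it would run
    // (a sketch: the real script executes these instead of printing them)
    fn dispatch(args: &[&str]) -> String {
        match args {
            ["fmt"] => "cargo fmt --all".to_string(),
            ["fmt", "--check"] | ["fmt", "-c"] => "cargo fmt --all -- --check".to_string(),
            ["check"] => "cargo check --workspace --all-targets --all-features".to_string(),
            ["clippy"] => {
                "cargo clippy --workspace --all-targets --all-features -- -D warnings".to_string()
            }
            ["test"] => "cargo test --workspace --all-features".to_string(),
            _ => "unknown command, see --help".to_string(),
        }
    }

    fn main() {
        let args: Vec<String> = std::env::args().skip(1).collect();
        let args: Vec<&str> = args.iter().map(String::as_str).collect();
        println!("{}", dispatch(&args));
    }
    ```
    
    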
    
    ## running the pipeline in the GitHub mirror
    ```bash
    const GH_API_OPTIONS = [
        -H "Accept: application/vnd.github+json"
        -H "X-GitHub-Api-Version: 2022-11-28"
    ]
    let res = gh api ...$GH_API_OPTIONS /repos/dragoon-rs/komodo/actions/runs | from json
    ```
    ```bash
    let runs = $res.workflow_runs
        | where head_branch == "use-nob-to-build"
        | select id head_sha status conclusion run_started_at
        | into datetime run_started_at
        | sort-by run_started_at
    ```
    ```bash
    $runs
        | update id { $"[`($in)`]\(https://github.com/($GITHUB_MIRROR)/actions/runs/($in)\)" }
        | update run_started_at { format date "%Y-%m-%dT%H:%M:%S" }
        | to md --pretty
    ```
    | id                                                                             | head_sha                                 | status    | conclusion | run_started_at      |
    | ------------------------------------------------------------------------------ | ---------------------------------------- | --------- | ---------- | ------------------- |
    | [`14237650542`](https://github.com/dragoon-rs/komodo/actions/runs/14237650542) | d67f1cfd | completed | success    | 2025-04-03T07:44:14 |
    | [`14237741570`](https://github.com/dragoon-rs/komodo/actions/runs/14237741570) | 9ef598a1 | completed | success    | 2025-04-03T07:49:40 |
    | [`14238086977`](https://github.com/dragoon-rs/komodo/actions/runs/14238086977) | 0a79edf3 | completed | success    | 2025-04-03T08:09:13 |
    | [`14238175174`](https://github.com/dragoon-rs/komodo/actions/runs/14238175174) | a84b2b12 | completed | success    | 2025-04-03T08:13:52 |
    | [`14239395984`](https://github.com/dragoon-rs/komodo/actions/runs/14239395984) | 8594c9bf | completed | success    | 2025-04-03T09:16:00 |
  • remove `check-nushell-files` (!204) · 4b53a261
    STEVAN Antoine authored
    this removes the `check-nushell-files.nu` script i left in !200.
  • run the pipeline on any push in the GitHub mirror (!203) · c462679e
    STEVAN Antoine authored
    This should close #19.
    
    ## changelog
    - triggers pipeline in the GitHub mirror on each push
    - add `./scripts/get-mirror-pipelines.nu`
    
    ## the pipelines
    >  **Important**
    >
    > i just had to add `run-mirror-pipeline-everytime` to the list of protected branches and trigger the push with an empty commit and it runs
    >
    > and then a single call to `nu scripts/get-mirror-pipelines.nu "run-mirror-pipeline-everytime"` gives the table below
    
    | id                                                                             | head_sha                                 | status    | conclusion | run_started_at      |
    | ------------------------------------------------------------------------------ | ---------------------------------------- | --------- | ---------- | ------------------- |
    | [`14238533729`](https://github.com/dragoon-rs/komodo/actions/runs/14238533729) | 2f32ee59 | completed | success    | 2025-04...
  • build the pipelines image from Dockerfile (!205) · 34062030
    STEVAN Antoine authored
    This adds a Dockerfile to build an image for the pipelines, once and for all, until we decide to bump the Rust toolchain.
    
    ## changelog
    - add `.gitlab-ci.dockerfile` which
      - uses `rust:latest` as a base (it appears using `alpine:latest` is both bigger and slower 🤷)
      - installs basic system dependencies
      - uses `rust-toolchain.toml` to install the toolchain of the project
      - installs `cargo-script`
    - use [`gitlab-registry.isae-supaero.fr/dragoon/komodo:bcb0e6b5`](https://gitlab.isae-supaero.fr/dragoon/komodo/container_registry/42) in the GitLab pipelines and [`ghcr.io/dragoon-rs/dragoon/komodo:bcb0e6b5`](https://github.com/orgs/dragoon-rs/packages/container/dragoon%2Fkomodo/388672772?tag=bcb0e6b5f73420762f6208700a43291e0066c2c3) in the GitHub pipelines and remove the "manual" dependency installation
    - add `./make.rs container`, `./make.rs container --login` and `./make.rs container --push`
      - uses `nob.rs@7ea6be8` to capture output of...
  • add `container list` for Docker containers (!207) · 38fe0ff4
    STEVAN Antoine authored
    ## examples
    ```bash
    ./make.rs container list --json
        | from ndjson
        | into datetime CreatedAt
        | into filesize Size VirtualSize
        | reject CreatedSince
    ```
    or
    ```bash
    ./make.rs container list --json
        | from ndjson
        | into datetime CreatedAt
        | into filesize Size VirtualSize
        | reject CreatedSince
        | select ID Repository Tag CreatedAt VirtualSize
        | update Tag { str substring 0..<7 }
    ```
    
    ## changelog
    - transform options of `container` to sub-subcommands
      - `container` --> `container build`
      - `container --login` --> `container login`
      - `container --push` --> `container push`
    - add `container list` to print the local images for the GitLab and GitHub repositories
      - `container list` will print in a pretty table
      - `container list --json` will print as NDJSON, i.e. one image per line as JSON
    - use wrappers around `nob::run_cmd_as_vec_and_fail!`
      - `extend_and_run` to run a partial command with an extra vector ...
  • update the readme (!206) · 6ebc8bbd
    STEVAN Antoine authored
    This MR explains the use of `make.rs` and `nob.rs` in the README for newcomers.
Showing with 776 additions and 200 deletions
FROM rust:latest
# Suppress prompts during package installation
ENV DEBIAN_FRONTEND=noninteractive
RUN apt update --yes && apt upgrade --yes
RUN apt install --yes protobuf-compiler
COPY rust-toolchain.toml /
RUN rustup show && cargo --version
RUN cargo install cargo-script
RUN apt clean && rm -rf /var/lib/apt/lists/*
name: Rust CI

on: [push, pull_request, workflow_dispatch]

jobs:
  fmt:
    runs-on: ubuntu-latest
    container:
      image: "ghcr.io/dragoon-rs/dragoon/komodo:bcb0e6b5f73420762f6208700a43291e0066c2c3"
    if: "!contains(github.event.head_commit.message, 'draft:') && !contains(github.event.head_commit.message, 'no-ci:')"
    steps:
      - uses: actions/checkout@v3
      - name: Run fmt check
        run: |
          ./make.rs fmt --check

  test:
    runs-on: ubuntu-latest
    container:
      image: "ghcr.io/dragoon-rs/dragoon/komodo:bcb0e6b5f73420762f6208700a43291e0066c2c3"
    needs: fmt
    if: "!contains(github.event.head_commit.message, 'draft:') && !contains(github.event.head_commit.message, 'no-ci:')"
    steps:
      - uses: actions/checkout@v3
      - name: Show configuration
        run: |
          ./make.rs version
      - name: Run tests
        run: |
          ./make.rs check
          ./make.rs clippy
          ./make.rs test
# Rust
target/
Cargo.lock
*.ndjson
*.png
# IDEs
.idea
.vscode
image: "rust:latest"
image: "gitlab-registry.isae-supaero.fr/dragoon/komodo:bcb0e6b5f73420762f6208700a43291e0066c2c3"

stages:
  - fmt
  - test

variables:
  NUSHELL_ARCH: "x86_64-unknown-linux-musl"
  NUSHELL_VERSION: "0.95.0"

workflow:
  rules:
    - if: $CI_COMMIT_MESSAGE =~ /^(draft|no-ci):/

@@ -19,28 +15,15 @@ workflow:
fmt:
  stage: fmt
  script:
    - make fmt-check
    - ./make.rs fmt --check

test:
  stage: test
  needs:
    - fmt
  before_script:
    - apt update --yes
    - apt upgrade --yes
    - apt install protobuf-compiler --yes
    - export NUSHELL_BUILD="nu-$NUSHELL_VERSION-$NUSHELL_ARCH"
    - export PATH="/tmp/:$PATH"
    # install Nushell
    - curl -fLo /tmp/nu.tar.gz "https://github.com/nushell/nushell/releases/download/$NUSHELL_VERSION/$NUSHELL_BUILD.tar.gz"
    - tar xvf /tmp/nu.tar.gz --directory /tmp
    - cp "/tmp/$NUSHELL_BUILD/nu" /tmp/nu
    - make show
  script:
    - make check clippy test example
    - ./make.rs version
    - ./make.rs check
    - ./make.rs clippy
    - ./make.rs test
REVISION: 1aa2ed1947a0b891398558fcf4e4289849cc5a1d
VERSION: 0.102.0
[package]
name = "komodo"
version = "1.0.0"
version = "1.0.1"
edition = "2021"
description = "Komodo: cryptographically-proven erasure coding for distributed systems"
repository = "https://gitlab.isae-supaero.fr/dragoon/komodo"
@@ -25,26 +25,31 @@ thiserror = "1.0.50"
tracing = "0.1.40"
tracing-subscriber = "0.3.17"
ark-poly-commit = { git = "https://gitlab.isae-supaero.fr/a.stevan/poly-commit", version = "0.4.0", rev = "19fc0d4", optional = true }
dragoonfri = { version = "0.1.0", optional = true}
[workspace]
members = [
"benchmarks",
"bins/rank",
"bins/saclin",
]
[dev-dependencies]
ark-bls12-381 = "0.4.0"
clap = { version = "4.5.17", features = ["derive"] }
itertools = "0.13.0"
rand = "0.8.5"
dragoonfri-test-utils = "0.1.0"
hex = "0.4.3"
[features]
kzg = ["dep:ark-poly-commit"]
aplonk = ["dep:ark-poly-commit"]
fri = ["dep:dragoonfri"]
fs = []
[package.metadata.docs.rs]
features = ["kzg", "aplonk"]
rustdoc-args = [ "--html-in-header", "katex.html" ]
[[example]]
name = "kzg"
@@ -53,3 +58,11 @@ required-features = ["kzg"]
[[example]]
name = "aplonk"
required-features = ["aplonk"]
[[example]]
name = "fri"
required-features = ["fri"]
[[example]]
name = "fec"
required-features = ["fri"]
.PHONY: fmt fmt-check check clippy test-rs test-nu test example show doc build-examples
DEFAULT_GOAL: fmt-check check clippy test-rs

fmt-check:
	cargo fmt --all -- --check

fmt:
	cargo fmt --all

check:
	cargo check --workspace --all-targets
	cargo check --workspace --all-targets --features kzg
	cargo check --workspace --all-targets --features aplonk
	cargo check --workspace --all-targets --all-features

clippy:
	cargo clippy --workspace --all-targets --all-features -- -D warnings

test-rs:
	cargo test --workspace --verbose --all-features
	cargo test --examples --verbose

test-nu:
	nu bins/saclin/tests/cli.nu
	nu bins/saclin/tests/binary.nu

test: test-rs test-nu

example:
	nu bins/saclin/examples/cli.nu

show:
	rustup --version
	rustup show --verbose
	rustc --version
	cargo --version
	cargo clippy --version
	nu --version

doc:
	cargo doc --document-private-items --no-deps --open

build-examples:
	cargo build --examples --release
# Komodo: Cryptographically-proven Erasure Coding
## the library
see `cargo doc` or [the library itself](src/)
[![release](https://gitlab.isae-supaero.fr/dragoon/komodo/-/badges/release.svg)](https://gitlab.isae-supaero.fr/dragoon/komodo/-/releases)
[![crate](https://img.shields.io/crates/v/komodo)](https://crates.io/crates/komodo)
[![docs](https://img.shields.io/docsrs/komodo)](https://docs.rs/komodo/latest/komodo/)
[![source](https://gitlab.isae-supaero.fr/dragoon/komodo/badges/main/pipeline.svg?key_text=GitLab%20CI)](https://gitlab.isae-supaero.fr/dragoon/komodo/-/pipelines)
[![mirror](https://github.com/dragoon-rs/komodo/actions/workflows/ci.yml/badge.svg)](https://github.com/dragoon-rs/komodo/actions)
## the tests
```shell
make
```
or
```shell
make check clippy test-rs
```
Komodo uses a build system entirely written in Rust.
- [`cargo-script`](https://crates.io/crates/cargo-script) to build the script
- [`nob.rs`](https://gitlab.isae-supaero.fr/a.stevan/nob.rs) to run commands
- [`clap`](https://crates.io/crates/clap) to provide a nice and complete build API
### some extra tests
this project defines some tests written in [Nushell](https://www.nushell.sh/) to test an
[implementation of Komodo in a CLI application](bins/saclin/).
First, [install `cargo-script`](https://github.com/DanielKeep/cargo-script#installation).
If you have [Nushell installed](https://www.nushell.sh/book/installation.html), you can run these
with the following command:
Then, run the script with `./make.rs --help`
## the library
```shell
make test-nu
./make.rs doc
```
## examples
A [CLI example](bins/saclin/examples/cli.nu) is also provided and can be run with
## the tests
```shell
make example
./make.rs check
./make.rs clippy
./make.rs test
```
Other examples that showcase the Komodo API are available in [`examples/`](examples/).
@@ -33,6 +32,28 @@ Other examples that showcase the Komodo API are available in [`examples/`](examp
## the benchmarks
see [`benchmarks/`](benchmarks/README.md)
the results can be found in [`dragoon/komodo-benchmark-results`](https://gitlab.isae-supaero.fr/dragoon/komodo-benchmark-results).
## development
Komodo uses a Docker image as the base of the GitLab pipelines.
That means that there is nothing to build apart from the source code of Komodo itself when running jobs.
When the development environment needs to change, e.g. when the version of Rust is bumped in
[`rust-toolchain.toml`](./rust-toolchain.toml), one shall run the following commands to push the new
Docker image to the [_container registry_][gitlab.isae-supaero.fr:dragoon/komodo@containers].
```shell
./make.rs container --login
```
```shell
./make.rs container
```
```shell
./make.rs container --push
```
## contributors
Because the code for this project has been originally extracted from
@@ -45,3 +66,4 @@ note that the following people have contributed to this code base:
- @j.detchart
[pcs-fec-id]: https://gitlab.isae-supaero.fr/dragoon/pcs-fec-id
[gitlab.isae-supaero.fr:dragoon/komodo@containers]: https://gitlab.isae-supaero.fr/dragoon/komodo/container_registry/42
File added
File added
[package]
name = "benchmarks"
version = "1.0.0"
version = "1.0.1"
edition = "2021"
[dependencies]
@@ -20,6 +20,7 @@ ark-secp256r1 = "0.4.0"
ark-std = "0.4.0"
ark-vesta = "0.4.0"
clap = { version = "4.5.4", features = ["derive"] }
komodo = { path = "..", features = ["fri"] }
plnk = { git = "https://gitlab.isae-supaero.fr/a.stevan/plnk", tag = "0.7.0", version = "0.7.0" }
rand = "0.8.5"
dragoonfri = { version = "0.1.0" }
# Table of contents
- [Requirements](#requirements)
- [Run the benchmarks](#run-the-benchmarks)
    - [define them](#define-them)
    - [run them](#run-them)
- [Plot the benchmarks](#plot-the-benchmarks)
## requirements

- install [GPLT](https://gitlab.isae-supaero.fr/a.stevan/gplt)

> :bulb: **Note**
>
> these should only be required for plotting results

- create a virtual environment
```bash
const VENV = "~/.local/share/venvs/gplt/bin/activate.nu" | path expand
virtualenv ($VENV | path dirname --num-levels 2)
```

- activate the virtual environment
```bash
overlay use $VENV
```

- activate required modules
```bash
use benchmarks
```
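In the virtual-environment step above, `path dirname --num-levels 2` strips the two trailing path components (`bin/activate.nu`) so that `virtualenv` receives the root of the environment. A quick Python sketch of the same path manipulation (the path is just the example value used above):

```python
from pathlib import Path

# stand-in for the VENV constant used in the requirements above
venv_activate = Path("~/.local/share/venvs/gplt/bin/activate.nu")

# `parents[1]` goes two levels up, like `path dirname --num-levels 2`
print(venv_activate.parents[1])  # → ~/.local/share/venvs/gplt
```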
> :bulb: **Note**
>
> I personally use the [`nuenv` hook](https://github.com/nushell/nu_scripts/blob/main/nu-hooks/nu-hooks/nuenv/hook.nu)
> that reads [`.env.nu`](../.env.nu).
## Run the benchmarks

### define them

> :bulb: **Note**
>
> the FRI benchmarks don't use a module from [src/bin/](src/bin/) with PLNK but rather an
> [example](../examples/fri.rs)

```bash
const RESULTS_DIR = "/path/to/komodo-benchmark-results/"

let benchmarks = {
linalg: {
enabled: true,
sizes: (seq 0 7 | each { 2 ** $in }),
output: "linalg.ndjson",
append: true,
},
setup: {
enabled: true,
degrees: (seq 0 13 | each { 2 ** $in }),
curves: [ bls12381, pallas, bn254 ],
output: "setup.ndjson",
append: true,
},
commit: {
enabled: true,
degrees: (seq 0 13 | each { 2 ** $in }),
curves: [ bls12381, pallas, bn254 ],
output: "commit.ndjson",
append: true,
},
recoding: {
enabled: true,
sizes: (seq 0 18 | each { 512 * 2 ** $in }),
ks: [2, 4, 8, 16],
curves: [ bls12381 ],
output: "recoding.ndjson",
append: true,
},
fec: {
enabled: true,
sizes: (seq 0 18 | each { 512 * 2 ** $in }),
ks: [2, 4, 8, 16],
curves: [ bls12381 ],
output: "fec.ndjson",
append: true,
},
fri: {
enabled: true,
sizes: (seq 0 15 | each { 2 ** $in * 4096b }),
ks: [8, 128, 1024, 4096],
blowup_factors: [2, 4],
ns: [2],
remainder_plus_ones: [1],
nb_queries: [50],
hashes: ["sha3-512"],
ffs: ["fp128", "bls12-381"],
output: "fri.ndjson",
append: true,
},
field: {
enabled: true,
nb_measurements: 1000,
output: "field.ndjson",
append: true,
},
curve_group: {
enabled: true,
nb_measurements: 1000,
output: "curve_group.ndjson",
append: true,
},
}
```
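As a sanity check, the size expressions in the record above can be expanded by hand; this small Python sketch (not part of the tooling) mirrors the two `seq`-based formulas:

```python
# fec / recoding sizes: 512 B * 2^i for i in 0..=18
sizes = [512 * 2**i for i in range(19)]
assert sizes[0] == 512                     # 512 B
assert sizes[-1] == 128 * 1024 * 1024      # 128 MiB

# fri sizes: 4096 B * 2^i for i in 0..=15
fri_sizes = [4096 * 2**i for i in range(16)]
assert fri_sizes[-1] == 128 * 1024 * 1024  # also 128 MiB
```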
### run them

```bash
benchmarks run --output-dir $RESULTS_DIR $benchmarks
```

> the following `watch` can be used to see the results as they are dumped to `$RESULTS_DIR`
> ```bash
> watch $RESULTS_DIR { |op, path|
>     $"($op) ($path)"
> }
> ```
## Plot the benchmarks
```bash
let plots = {
linalg: { file: "linalg.ndjson" },
setup: { file: "setup.ndjson" },
commit: { file: "commit.ndjson" },
fec: { file: "fec.ndjson" },
recoding: { file: "recoding.ndjson" },
fri: [
[name, y_type, single, identity, normalize];
[evaluating, duration, false, false, false ],
[encoding, duration, false, false, false ],
[proving, duration, false, false, false ],
[decoding, duration, false, false, false ],
[verifying, duration, true, false, false ],
[proofs, filesize, false, true, true ],
[commits, filesize, true, true, true ],
[proofs, filesize, false, true, false ],
[commits, filesize, true, true, false ],
],
field: {
title: "field operations",
file: field.ndjson,
simple_operations: [ "exponentiation", "legendre", "inverse", "sqrt" ],
},
curve_group: {
title: "curve group operations",
file: curve_group.ndjson,
simple_operations: [ "random sampling", "scalar multiplication", "affine scalar multiplication" ],
},
}
```
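Each row of the `fri` table above picks one metric and toggles how it is post-processed before plotting: durations are converted from ns to ms, `single` divides by `k * blowup_factor` to get the cost of one shard, and `normalize` divides by the data size. A rough Python sketch of that logic (function and constant names are illustrative, not part of the tooling):

```python
NS_PER_MS = 1_000_000

def transform(y, x, k, bf, *, y_type="duration", single=False, normalize=False):
    """Post-process one raw measurement the way the FRI plots do."""
    if y_type == "duration":
        y /= NS_PER_MS  # raw timings are in ns, plots are in ms
    if single:
        y /= k * bf     # cost of a single shard out of the k * bf produced
    if normalize:
        y /= x          # cost per byte of input data
    return y

print(transform(2_000_000, x=1024, k=8, bf=2))  # → 2.0 (ms)
```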
```bash
benchmarks plot $plots --input-dir "/path/to/komodo-benchmark-results/<hash>" --output-dir "./figures/"
```
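The `<hash>` path component above is how the `run` command names each result directory: the SHA-256 of the machine's CPU description (from `lscpu`) concatenated with the Komodo commit, which ties every result set to both the hardware and the code revision. A Python sketch of the derivation (the CPU record and commit are hypothetical stand-ins):

```python
import hashlib
import json

cpu = {"Model name": "example-cpu", "CPU(s)": "8"}   # stand-in for `lscpu --json` output
commit = "0123456789abcdef0123456789abcdef01234567"  # stand-in for `git rev-parse HEAD`

digest = hashlib.sha256((json.dumps(cpu) + commit).encode()).hexdigest()
print(len(digest))  # → 64: result directories are named by 64 hex characters
```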
@@ -3,3 +3,262 @@ export module nu-lib/commit.nu
export module nu-lib/fec/
export module nu-lib/recoding.nu
export module nu-lib/linalg.nu
export module nu-lib/fri/
use nu-lib/linalg.nu
use nu-lib/setup.nu
use nu-lib/commit.nu
use nu-lib/recoding.nu
use nu-lib/fec/
use nu-lib/fri/
use nu-lib/utils/log.nu
use nu-lib/utils/parse.nu read-atomic-ops
const CPU_FIELDS = [
"Architecture",
"CPU op-mode(s)",
"Address sizes",
"Byte Order",
"CPU(s)",
"On-line CPU(s) list",
"Model name",
"CPU family",
"Model",
"Thread(s) per core",
"Core(s) per socket",
"Socket(s)",
"Stepping",
"CPU max MHz",
"CPU min MHz",
"BogoMIPS",
"Virtualization",
"L1d cache",
"L1i cache",
"L2 cache",
"L3 cache",
"NUMA node(s)",
"NUMA node0 CPU(s)",
]
export def run [
benchmarks: record<
linalg: record<
enabled: bool,
sizes: list<int>,
output: string,
append: bool,
>,
setup: record<
enabled: bool,
degrees: list<int>,
curves: list<string>,
output: string,
append: bool,
>,
commit: record<
enabled: bool,
degrees: list<int>,
curves: list<string>,
output: string,
append: bool,
>,
recoding: record<
enabled: bool,
sizes: list<int>,
ks: list<int>,
curves: list<string>,
output: string,
append: bool,
>,
fec: record<
enabled: bool,
sizes: list<int>,
ks: list<int>,
curves: list<string>,
output: string,
append: bool,
>,
fri: record<
enabled: bool,
sizes: list<filesize>,
ks: list<int>,
blowup_factors: list<int>,
ns: list<int>,
remainder_plus_ones: list<int>,
nb_queries: list<int>,
hashes: list<string>,
ffs: list<string>,
output: string,
append: bool,
>,
field: record<enabled: bool, nb_measurements: int, output: string, append: bool>,
curve_group: record<enabled: bool, nb_measurements: int, output: string, append: bool>,
>,
--output-dir: path = ".",
] {
let cpu = lscpu --json
| from json
| get lscpu
| update field { str trim --right --char ":" }
| transpose --header-row
| into record
| select ...$CPU_FIELDS
let commit = git rev-parse HEAD
let hash = $cpu | to json | $in + $commit | hash sha256
let target = $output_dir | path join $hash
mkdir $target
$cpu | to json | save --force ($target | path join "cpu.json")
$commit | save --force ($target | path join "komodo.txt")
let benchmarks = $benchmarks
| insert linalg.run {{ |it|
let output = $target | path join $it.output
$it.sizes | linalg run --no-confirm --output $output --append=$it.append
}}
| insert setup.run {{ |it|
let output = $target | path join $it.output
$it.degrees | setup run --curves $it.curves --no-confirm --output $output --append=$it.append
}}
| insert commit.run {{ |it|
let output = $target | path join $it.output
$it.degrees | commit run --curves $it.curves --no-confirm --output $output --append=$it.append
}}
| insert recoding.run {{ |it|
let output = $target | path join $it.output
$it.sizes | recoding run --ks $it.ks --curves $it.curves --no-confirm --output $output --append=$it.append
}}
| insert fec.run {{ |it|
let output = $target | path join $it.output
$it.sizes | fec run --ks $it.ks --curves $it.curves --no-confirm --output $output --append=$it.append
}}
| insert fri.run {{ |it|
# FIXME: refactor this
if $it.append {
(
fri run
--data-sizes $it.sizes
--ks $it.ks
--blowup-factors $it.blowup_factors
--nb-queries $it.nb_queries
--hashes $it.hashes
--finite-fields $it.ffs
--remainders $it.remainder_plus_ones
--folding-factors $it.ns
) | to ndjson out>> ($target | path join $it.output)
} else {
(
fri run
--data-sizes $it.sizes
--ks $it.ks
--blowup-factors $it.blowup_factors
--nb-queries $it.nb_queries
--hashes $it.hashes
--finite-fields $it.ffs
--remainders $it.remainder_plus_ones
--folding-factors $it.ns
) | to ndjson out> ($target | path join $it.output)
}
}}
| insert field.run {{ |it|
let options = [
--bin field
--release
--package benchmarks
--
--nb-measurements $it.nb_measurements
]
# FIXME: refactor this
if $it.append {
cargo run ...$options out>> ($target | path join $it.output)
} else {
cargo run ...$options out> ($target | path join $it.output)
}
}}
| insert curve_group.run {{ |it|
let options = [
--bin curve_group
--release
--package benchmarks
--
--nb-measurements $it.nb_measurements
]
# FIXME: refactor this
if $it.append {
cargo run ...$options out>> ($target | path join $it.output)
} else {
cargo run ...$options out> ($target | path join $it.output)
}
}}
let _ = $benchmarks | items { |k, b|
if ($b.enabled? | default true) {
log info $"running (ansi cyan)($k)(ansi reset)"
do $b.run $b
} else {
log warning $"skipping (ansi cyan)($k)(ansi reset)"
}
}
}
export def plot [plots: record, --input-dir: path, --output-dir: path = "./figures/"] {
mkdir $output_dir
let linalg_file = $input_dir | path join $plots.linalg.file
let fec_file = $input_dir | path join $plots.fec.file
let recoding_file = $input_dir | path join $plots.recoding.file
for op in [ "mul", "transpose", "inverse" ] {
linalg plot $linalg_file $op --save ($output_dir | path join $"linalg-($op).png")
}
setup plot ($input_dir | path join $plots.setup.file) --save ($output_dir | path join setup.png)
commit plot ($input_dir | path join $plots.commit.file) --save ($output_dir | path join commit.png)
recoding plot $recoding_file --save ($output_dir | path join recoding.png)
fec plot encoding $fec_file --save ($output_dir | path join encoding.png)
fec plot decoding $fec_file --save ($output_dir | path join decoding.png)
fec plot e2e $fec_file --save ($output_dir | path join end_to_end.png)
fec plot combined $fec_file --recoding $recoding_file --save ($output_dir | path join combined.png)
fec plot ratio $fec_file --recoding $recoding_file --save ($output_dir | path join ratio.png)
for plot in $plots.fri {(
fri plot
--dump-dir $output_dir
--file ($input_dir | path join fri.ndjson)
$plot.name
--y-type $plot.y_type
--single=$plot.single
--identity=$plot.identity
--normalize=$plot.normalize
--save
)}
for plot in ($plots | select field curve_group | values) {
def output [prefix: string]: [ nothing -> record<path: path, title: string> ] {
let title_tokens = $plot.title | split row " " | prepend $prefix
{
path: ({
parent: $output_dir,
stem: ($title_tokens | str join "_"),
extension: "png",
} | path join),
title: ($title_tokens | str join " "),
}
}
let data = open ($input_dir | path join $plot.file)
output "simple" | gplt multi-bar --title $in.title -l "time (in ns)" (
$data | read-atomic-ops --include $plot.simple_operations | to json
) --save $in.path
output "complex" | gplt multi-bar --title $in.title -l "time (in ns)" (
$data | read-atomic-ops --exclude $plot.simple_operations | to json
) --save $in.path
}
}
@@ -2,6 +2,7 @@ use utils log
use utils math *
use utils fs check-file
use utils plot [ into-axis-options, COMMON_OPTIONS, gplt ]
use utils args check-list-arg
use std formats *
@@ -12,22 +13,19 @@ use std formats *
export def run [
--output: path, # the output path (defaults to a random file in $nu.temp-path)
--curves: list<string>, # the curves to benchmark
--no-confirm (-y), # does not ask for confirmation if the output file already exists; it will be overwritten
--nb-measurements: int = 10, # the number of measurements per benchmark run
--append, # append to the output path instead of overwriting
]: list<int> -> path {
$curves | check-list-arg --cmd "commit run" --arg "--curves" --span (metadata $curves).span
$in | check-list-arg --cmd "commit run" --arg "pipeline input"
let new_file = $output == null
let output = $output | default (mktemp --tmpdir komodo_commit.XXXXXX)
let pretty_output = $"(ansi purple)($output)(ansi reset)"
if ($output | path exists) and not $new_file {
log warning $"($pretty_output) already exists"
if not $no_confirm {
let res = ["no", "yes"] | input list $"Do you want to overwrite ($pretty_output)?"
if $res == null or $res == "no" {
log info "aborting"
@@ -37,11 +35,20 @@
}
}
let options = [
--release
--package benchmarks
--bin commit
--
--nb-measurements $nb_measurements
...$in
--curves ...$curves
]
if $append {
cargo run ...$options out>> $output
} else {
cargo run ...$options out> $output
}
log info $"results saved to ($pretty_output)"
$output
@@ -65,7 +72,7 @@ export def plot [
| select name x y e
| group-by name --to-table
| reject items.name
| rename --column { name: "name", items: "points" }
| insert style.color {|it|
match $it.name {
"BLS12-381" => "tab:blue"
@@ -24,7 +24,7 @@ export def encoding [
| sort-by x
| group-by k --to-table
| reject items.k
| rename --column { k: "name", items: "points" }
| update name { $"$k = ($in)$" }
let options = [
@@ -56,7 +56,7 @@ export def decoding [
| sort-by x
| group-by k --to-table
| reject items.k
| rename --column { k: "name", items: "points" }
| update name { $"$k = ($in)$" }
let options = [
@@ -89,7 +89,7 @@ export def e2e [
| update times { $it.items.0.times | zip $it.items.1.times | each { $in.0 + $in.1 } }
}
| flatten --all
| reject foo
| ns-to-ms times
| compute-stats times
| reject times
@@ -99,7 +99,7 @@
| sort-by x
| group-by k --to-table
| reject items.k
| rename --column { k: "name", items: "points" }
| update name { $"$k = ($in)$" }
let options = [
@@ -144,7 +144,7 @@ export def combined [
}
| reject items.shards
| insert style.line.type "solid"
| rename --column { shards: "name", items: "points" }
| update name { $"$k = ($in)$" }
let re_encoding_graphs = open --raw $data
@@ -159,7 +159,7 @@
| update times { $it.items.0.times | zip $it.items.1.times | each { $in.0 + $in.1 } }
}
| flatten --all
| reject key
| ns-to-ms times
| compute-stats times
| reject times
@@ -179,7 +179,7 @@
}
| insert style.line.type "dashed"
| reject items.k
| rename --column { k: "name", items: "points" }
| reject name
let graphs = $recoding_graphs
@@ -254,7 +254,7 @@ export def ratio [
| update times { $it.items.0.times | zip $it.items.1.times | each { $in.0 + $in.1 } }
}
| flatten --all
| reject key
| ns-to-ms times
| compute-stats times
| where name == "BLS12-381"
@@ -281,7 +281,7 @@
}
}
| reject items.k
| rename --column { k: "name", items: "points" }
| update name { $"$k = ($in)$" }
let options = [
use ../utils log
use ../utils formats *
use ../utils args check-list-arg
use std formats *
@@ -11,22 +12,20 @@ export def main [
--output: path, # the output path (defaults to a random file in $nu.temp-path)
--ks: list<int>, # the values of $k$ to benchmark
--curves: list<string>, # the curves to benchmark
--no-confirm (-y), # does not ask for confirmation if the output file already exists; it will be overwritten
--nb-measurements: int = 10, # the number of measurements per benchmark run
--append, # append to the output path instead of overwriting
]: list<int> -> path {
$ks | check-list-arg --cmd "fec run" --arg "--ks" --span (metadata $ks).span
$curves | check-list-arg --cmd "fec run" --arg "--curves" --span (metadata $curves).span
$in | check-list-arg --cmd "fec run" --arg "pipeline input"
let new_file = $output == null
let output = $output | default (mktemp --tmpdir komodo_fec.XXXXXX)
let pretty_output = $"(ansi purple)($output)(ansi reset)"
if ($output | path exists) and not $new_file {
log warning $"($pretty_output) already exists"
if not $no_confirm {
let res = ["no", "yes"] | input list $"Do you want to overwrite ($pretty_output)?"
if $res == null or $res == "no" {
log info "aborting"
@@ -36,17 +35,29 @@
}
}
if not $append {
"" out> $output
}
let input = $in
for k in $ks {
let options = [
--release
--package benchmarks
--bin fec
--
--nb-measurements $nb_measurements
...$input
--encoding vandermonde
-k $k
-n 1
--curves ...$curves
]
if $append {
cargo run ...$options | from ndnuon | to ndjson out>> $output
} else {
cargo run ...$options | from ndnuon | to ndjson out> $output
}
}
log info $"results saved to ($pretty_output)"
export module run.nu
export module plot.nu
use std formats [ "from ndjson" ]
use ../utils plot [ into-axis-options, COMMON_OPTIONS ]
const NB_MS_IN_NS = 1_000_000
def plot [
name: string,
--save,
--y-type: string,
--single,
--identity,
--normalize,
--dump-dir: path,
] {
let ds = $in | get d | uniq
let graphs = $in
| select $name d k bf ff
| group-by { |it| $"($it.k):($it.ff):($it.bf)" }
| transpose name points
| update name {
let res = $in | parse "{k}:{ff}:{bf}" | into record
$"$k = ($res.k)$, $\\mathbb{F} = $ ($res.ff), $BF = ($res.bf)$"
}
| update points {
rename --column { $name: "y", d: "x" }
| update y { if $y_type == "duration" { $in / $NB_MS_IN_NS } else { $in } }
| if $single { update y { |it| $it.y / ($it.k * $it.bf) } } else { $in }
| if $normalize { update y { |it| $it.y / $it.x } } else { $in }
| sort-by x
}
| insert style { |it|
let type = match $it.points.ff.0 {
"fp128" => "solid",
"bls12-381" => "dashed",
_ => "",
}
let color = match $it.points.k.0 {
8 => "tab:blue",
128 => "tab:green",
1024 => "tab:orange",
4096 => "tab:red",
_ => "grey",
}
let marker = match $it.points.bf.0 {
2 => "o",
4 => "s",
_ => "*",
}
{ color: $color, line: { type: $type, marker: { shape: $marker } } }
}
| if $identity { append {
name: "$x \\mapsto x$",
points: ($ds | wrap x | merge ($ds | wrap y) | if $normalize { update y { |it| $it.y / $it.x } } else { $in }),
style: { color: "black", line: { type: "dotted" } },
} } else { $in }
| reject points.k? points.bf? points.ff?
let title = [
$name,
(if $single { "single" }),
(if $normalize { "normalized" }),
] | compact | str join '_'
let y_type = if $normalize { "plain" } else { $y_type }
let options = [
...($graphs.points | flatten | into-axis-options -x "filesize" -y $y_type)
--use-tex
--y-scale log
--x-scale log
--x-scale-base 2
--y-scale-base 2
--title $title
...(if $save { [ --save ($dump_dir | path join $"($title).png") ] } else {[]})
--fullscreen
]
$graphs | to json | gplt plot $in ...($options | compact)
}
export def main [
...x,
--file: path,
--y-type: string = "plain",
--single,
--identity,
--normalize,
--dump-dir: path = "./",
--save,
] {
if ($x | is-empty) {
error make --unspanned { msg: "nothing to do, x is empty" }
}
if $file == null {
error make --unspanned { msg: "missing --file" }
}
if not ($dump_dir | path exists) {
mkdir $dump_dir
}
let data = open $file | where h == "sha3-512" and q == 50
for i in $x {
$data | plot --save=$save $i --y-type=$y_type --single=$single --identity=$identity --normalize=$normalize --dump-dir=$dump_dir
}
}
use std iter
def "cartesian product" [
iters: list # the iterables you want the cartesian product of
]: nothing -> list {
def aux [a: list]: nothing -> list {
if ($a | is-empty) {
return []
}
let head = $a | first
let tail = aux ($a | skip 1)
if ($head | is-empty) {
return $tail
} else if ($tail | is-empty) {
return $head
}
$head | each {|h| $tail | each {|t| [$h, $t]}} | flatten | each { flatten }
}
aux $iters
}
# returns a record with all numeric results, merged with the parameters
def run [
params: record<
d: filesize, k: int, bf: int, q: int, h: string, ff: string, n: int, rpo: int
>
] {
cargo run --quiet --release --example fri --features fri -- ...[
--data-size ($params.d | into int)
-k $params.k
--blowup-factor $params.bf
--remainder-degree-plus-one $params.rpo
--folding-factor $params.n
--nb-queries $params.q
--hash $params.h
--finite-field $params.ff
]
| lines
| parse "{k}: {v}"
| into int v
| transpose --header-row
| into record
| merge $params
}
export def main [
--data-sizes: list<filesize>,
--ks: list<int>,
--blowup-factors: list<int>,
--nb-queries: list<int>,
--hashes: list<string>,
--finite-fields: list<string>,
--folding-factors: list<int>,
--remainders: list<int>,
] {
let inputs = [
$data_sizes, $ks, $blowup_factors, $nb_queries, $hashes, $finite_fields,
$folding_factors, $remainders
]
if ($inputs | any { is-empty }) {
error make --unspanned { msg: "one of the inputs is empty" }
}
let params = cartesian product $inputs | each { |params|
[d, k, bf, q, h, ff, n, rpo]
| iter zip-into-record $params
| into record
}
$params | each { |p|
print ($p | to nuon --raw)
run $p
}
}
@@ -2,6 +2,7 @@ use utils log
use utils math *
use utils fs check-file
use utils plot [ into-axis-options, COMMON_OPTIONS, gplt ]
use utils args check-list-arg
use std formats *
@@ -11,22 +12,18 @@ use std formats *
# - output: the output path, as NDJSON
export def run [
--output: path, # the output path (defaults to a random file in $nu.temp-path)
--no-confirm (-y), # does not ask for confirmation if the output file already exists; it will be overwritten
--nb-measurements: int = 10, # the number of measurements per benchmark run
--append, # append to the output path instead of overwriting
]: list<int> -> path {
$in | check-list-arg --cmd "linalg run" --arg "pipeline input"
let new_file = $output == null
let output = $output | default (mktemp --tmpdir komodo_linalg.XXXXXX)
let pretty_output = $"(ansi purple)($output)(ansi reset)"
if ($output | path exists) and not $new_file {
log warning $"($pretty_output) already exists"
if not $no_confirm {
let res = ["no", "yes"] | input list $"Do you want to overwrite ($pretty_output)?"
if $res == null or $res == "no" {
log info "aborting"
@@ -36,10 +33,19 @@ export def run [
}
}
let options = [
--release
--package benchmarks
--bin linalg
--
--nb-measurements $nb_measurements
...$in
]
if $append {
cargo run ...$options out>> $output
} else {
cargo run ...$options out> $output
}
log info $"results saved to ($pretty_output)"
$output
@@ -89,7 +95,8 @@ export def plot [
| where op == $op
| rename --column { n: "x", mean: "y", stddev: "e" }
| group-by name --to-table
| reject items.name items.op items.times
| rename --column { name: "name", items: "points" }
| insert style.color {|it|
match $it.name {
"BLS12-381" => "tab:blue"