Merge pull request #612 from LedgerHQ/develop

App release 1.11.0
This commit is contained in:
apaillier-ledger
2024-07-25 14:02:24 +02:00
committed by GitHub
2546 changed files with 5474 additions and 16400 deletions


@@ -1,7 +1,6 @@
---
BasedOnStyle: Google
IndentWidth: 4
---
Language: Cpp
ColumnLimit: 100
PointerAlignment: Right
@@ -12,7 +11,6 @@ SortIncludes: false
SpaceAfterCStyleCast: true
AllowShortCaseLabelsOnASingleLine: false
AllowAllArgumentsOnNextLine: false
AllowAllParametersOfDeclarationOnNextLine: false
AllowShortBlocksOnASingleLine: Never
AllowShortFunctionsOnASingleLine: None
BinPackArguments: false


@@ -19,8 +19,8 @@ Describe your issue in as much detail as possible here.
## Steps to reproduce
-* Tell us how to reproduce this issue <br />
-* Where the issue is, if you know <br />
+* Tell us how to reproduce this issue
+* Where the issue is, if you know
* Which commands triggered the issue, if any
## Expected behaviour
@@ -37,4 +37,5 @@ Please paste any logs here that demonstrate the issue, if they exist
## Proposed solution
-If you have an idea of how to fix this issue, please write it down here, so we can begin discussing it
+If you have an idea of how to fix this issue, please write it down here,
+so we can begin discussing it


@@ -1,6 +1,6 @@
---
name: Feature report
about: Suggest an idea for this project
title: 'Add [Subject of the issue]'
labels: 'enhancement'
assignees: ''


@@ -1,6 +1,6 @@
---
name: Network request
about: Request of new chain or network
title: 'Add {NameChain} network ({SYMBOL})'
labels: 'enhancement, chain/network'
assignees: ''
@@ -8,12 +8,13 @@ assignees: ''
---
## Description
- [ ] Symbol:
- [ ] ChainId:
- [ ] Website:
- [ ] Block-Explorer:
- [ ] [slip-0044](https://github.com/satoshilabs/slips/blob/master/slip-0044.md) type:
## Additional comments
Please post additional comments in this section if you have them, otherwise delete it.


@@ -1,119 +0,0 @@
name: 'Commit and push if the version file has changed'
inputs:
name:
description: 'The name of the committer'
required: true
default: 'github-actions[bot]'
email:
description: 'The email of the committer'
required: true
default: 'github-actions[bot]@users.noreply.github.com'
message:
description: 'The commit message'
required: true
default: 'New release version(s)'
files:
description: 'The file(s) to add in the commit'
required: true
default: '*'
directory:
description: 'The directory in which the action will be performed'
required: true
default: '.'
src_branch:
description: 'Checkout (or create) a specific branch before commit/push. Defaults to current branch'
required: false
default: ''
dst_branch:
description: 'Push the created commit on a specific branch. Defaults to current branch'
required: false
default: ''
secret:
description: 'A token allowing to push the commit on the repository'
required: true
default: '.'
repository:
description: 'The repository where to push'
required: true
default: ''
runs:
using: 'composite'
steps:
- name: Commit the changes
id: commit
run: |
git config --global user.name ${{ inputs.name }}
ORIGIN="$(pwd)"
cd ${{ inputs.directory }}
CURRENT_BRANCH=${GITHUB_REF#refs/heads/};
# calculating source branch
if [ -n "${{ inputs.src_branch }}" ]; \
then \
git switch ${{ inputs.src_branch }} 2>/dev/null || git switch -c ${{ inputs.src_branch }}; \
SRC_BRANCH=${{ inputs.src_branch }}; \
else \
SRC_BRANCH=`git branch --show-current`; \
if [ -z "$SRC_BRANCH" ]; \
then \
SRC_BRANCH=$CURRENT_BRANCH; \
fi \
fi
# calculating destination branch
if [ -n "${{ inputs.dst_branch }}" ]; \
then \
DST_BRANCH=${{ inputs.dst_branch }}; \
else \
DST_BRANCH=`git branch --show-current`; \
if [ -z "$DST_BRANCH" ]; \
then \
DST_BRANCH=$CURRENT_BRANCH; \
fi \
fi
echo "-----------------------------------------------------------"
echo "Initial repo status"
git status
# checking changes, commit if needed
CHANGES="$(git status --porcelain ${{ inputs.files }})"
if [ -n "${CHANGES}" ]; \
then \
echo -e "Changes:\n${CHANGES}"; \
git add ${{ inputs.files }}; \
echo "-----------------------------------------------------------"; \
echo "Repo status before commit"; \
git status; \
git commit -am "${{ inputs.message }}"; \
fi
# compute if a push is needed
if [ -n "${CHANGES}" -o "$SRC_BRANCH" != "$DST_BRANCH" ]; \
then \
PUSH="YES"; \
else \
PUSH="NO"; \
fi
git log -n 2
cd "${ORIGIN}"
echo " -- Env SRC_BRANCH: $SRC_BRANCH";
echo " -- Env DST_BRANCH: $DST_BRANCH";
echo " -- Env PUSH: $PUSH"
# exporting these variables for next steps
echo "##[set-output name=src_branch;]$(echo $SRC_BRANCH)";
echo "##[set-output name=dst_branch;]$(echo $DST_BRANCH)";
echo "##[set-output name=push;]$(echo $PUSH)";
shell: bash
- name: Push commit
if: steps.commit.outputs.push == 'YES'
uses: ad-m/github-push-action@master
with:
github_token: ${{ inputs.secret }}
branch: ${{ steps.commit.outputs.dst_branch }}
directory: ${{ inputs.directory }}
repository: ${{ inputs.repository }}
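The removed composite action above resolves its source and destination branches with the same fallback chain: an explicit input wins, then the current git branch, then the branch derived from `GITHUB_REF`. A minimal sketch of that resolution logic as a plain shell function (`resolve_branch` is a hypothetical helper name, not part of the action):

```shell
#!/usr/bin/env bash
# Fallback chain used by the removed action:
# explicit input > current git branch > branch name stripped from GITHUB_REF.
resolve_branch() {
    local input_branch="$1" github_ref="$2"
    local current_branch
    # Outside a git repository this fails; suppress and fall through.
    current_branch=$(git branch --show-current 2>/dev/null || true)
    if [ -n "$input_branch" ]; then
        echo "$input_branch"
    elif [ -n "$current_branch" ]; then
        echo "$current_branch"
    else
        echo "${github_ref#refs/heads/}"   # e.g. refs/heads/develop -> develop
    fi
}

resolve_branch "release/1.11" ""   # prints "release/1.11"
```

The same chain runs twice in the action, once for `src_branch` and once for `dst_branch`; a push is then needed whenever there are changes or the two resolved branches differ.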


@@ -1,6 +1,6 @@
## Description
Please provide a detailed description of what was done in this PR.
(And mentioned if linked to an issue [docs](https://docs.github.com/en/issues/tracking-your-work-with-issues/linking-a-pull-request-to-an-issue))
## Changes include
@@ -18,4 +18,4 @@ Please complete this section if any breaking changes have been made, otherwise d
## Additional comments
Please post additional comments in this section if you have them, otherwise delete it.


@@ -1,15 +0,0 @@
---
name: 'Auto Author Assign'
on:
pull_request_target:
types: [opened, reopened]
permissions:
pull-requests: write
jobs:
assign-author:
runs-on: ubuntu-latest
steps:
- uses: toshimaru/auto-author-assign@v1.6.1


@@ -0,0 +1,52 @@
name: Build and run functional tests using ragger through reusable workflow
# This workflow will build the app and then run functional tests using the Ragger framework upon Speculos emulation.
# It calls a reusable workflow developed by Ledger's internal developer team to build the application and upload the
# resulting binaries.
# It then calls another reusable workflow to run the Ragger tests on the compiled application binary.
#
# While this workflow is optional, having functional testing on your application is mandatory and this workflow and
# tooling environment is meant to be easy to use and adapt after forking your application
on:
workflow_dispatch:
push:
branches:
- master
- main
- develop
pull_request:
jobs:
build_application:
name: Build application using the reusable workflow
uses: LedgerHQ/ledger-app-workflows/.github/workflows/reusable_build.yml@v1
with:
upload_app_binaries_artifact: "ragger_elfs"
flags: "CAL_TEST_KEY=1 DOMAIN_NAME_TEST_KEY=1 SET_PLUGIN_TEST_KEY=1 NFT_TEST_KEY=1"
ragger_tests:
name: Run ragger tests using the reusable workflow
needs: build_application
uses: LedgerHQ/ledger-app-workflows/.github/workflows/reusable_ragger_tests.yml@v1
with:
download_app_binaries_artifact: "ragger_elfs"
build_clone_app:
name: Build Clone app using the reusable workflow
uses: LedgerHQ/ledger-app-workflows/.github/workflows/reusable_build.yml@v1
with:
flags: "CHAIN=thundercore"
upload_app_binaries_artifact: "clone_elfs"
ragger_clone_tests:
name: Run ragger Clone tests using the reusable workflow
needs:
- build_application
- build_clone_app
uses: LedgerHQ/ledger-app-workflows/.github/workflows/reusable_ragger_tests.yml@v1
with:
download_app_binaries_artifact: "ragger_elfs"
additional_app_binaries_artifact: "clone_elfs"
additional_app_binaries_artifact_dir: ./tests/ragger/.test_dependencies/clone/build/
test_options: "--with_lib_mode"

.github/workflows/check_sdk.yml

@@ -0,0 +1,16 @@
---
name: Check SDK submodule version
on:
workflow_dispatch:
push:
branches:
- master
- main
- develop
pull_request:
jobs:
job_check_SDK:
name: Check Ethereum plugin SDK submodule is up-to-date
uses: LedgerHQ/ledger-app-workflows/.github/workflows/reusable_check_ethereum_sdk.yml@v1


@@ -1,201 +0,0 @@
---
name: Tests
on:
workflow_dispatch:
push:
branches:
- master
- develop
pull_request:
jobs:
# =====================================================
# ZEMU TESTS
# =====================================================
building_for_e2e_zemu_tests:
name: Building binaries for E2E Zemu tests
runs-on: ubuntu-latest
container:
image: ghcr.io/ledgerhq/ledger-app-builder/ledger-app-builder-lite:latest
steps:
- uses: actions/checkout@v3
- name: Build testing binaries
run: |
git config --global --add safe.directory "$GITHUB_WORKSPACE"
cd tests/zemu/ && ./build_local_test_elfs.sh
- name: Upload app binaries
uses: actions/upload-artifact@v3
with:
name: e2e_zemu_elfs
path: ./tests/zemu/elfs/
jobs-e2e-zemu-tests:
name: E2E Zemu tests
needs: [building_for_e2e_zemu_tests]
runs-on: ubuntu-latest
steps:
- name: Test
run: |
id
echo $HOME
echo $DISPLAY
- name: Checkout
uses: actions/checkout@v3
- run: sudo apt-get update -y && sudo apt-get install -y libusb-1.0.0 libudev-dev
- name: Install NodeJS
uses: actions/setup-node@v3
with:
node-version: "16"
- name: Install yarn
run: npm install -g yarn
- name: Build/Install build js deps
run: cd tests/zemu/ && yarn install
- name: Create tmp folder for artifacts
run: mkdir tests/zemu/elfs
- name: Download app binaries
uses: actions/download-artifact@v3
with:
path: tmp/
- name: Gather elfs
run: cp `find tmp/e2e_zemu_elfs/ -name "*.elf"` tests/zemu/elfs/
- name: Run zemu tests
run: cd tests/zemu/ && yarn test
# =====================================================
# SPECULOS TESTS
# =====================================================
building_for_e2e_speculos_tests:
name: Building binaries for E2E Speculos tests
runs-on: ubuntu-latest
container:
image: ghcr.io/ledgerhq/ledger-app-builder/ledger-app-builder-lite:latest
steps:
- uses: actions/checkout@v3
- name: Build testing binaries
run: |
mkdir tests/speculos/elfs
make clean && make -j DEBUG=1 NFT_STAGING_KEY=1 BOLOS_SDK=$NANOS_SDK && mv bin/app.elf tests/speculos/elfs/nanos.elf
make clean && make -j DEBUG=1 NFT_STAGING_KEY=1 BOLOS_SDK=$NANOX_SDK && mv bin/app.elf tests/speculos/elfs/nanox.elf
make clean && make -j DEBUG=1 NFT_STAGING_KEY=1 BOLOS_SDK=$NANOSP_SDK && mv bin/app.elf tests/speculos/elfs/nanosp.elf
- name: Upload app binaries
uses: actions/upload-artifact@v3
with:
name: e2e_speculos_elfs
path: ./tests/speculos/elfs
jobs-e2e-speculos-tests:
name: Speculos tests
strategy:
fail-fast: false
matrix:
model: ["nanosp", "nanos", "nanox"]
needs: [building_for_e2e_speculos_tests]
runs-on: ubuntu-latest
steps:
- name: Clone
uses: actions/checkout@v3
- name: Create tmp folder for artifacts
run: mkdir tests/speculos/elfs
- name: Download app binaries
uses: actions/download-artifact@v3
with:
path: tmp/
- name: Gather elfs
run: cp `find tmp/e2e_speculos_elfs/ -name "*.elf"` tests/speculos/elfs/
- name: Install dependencies
run: |
cd tests/speculos
sudo apt-get update && sudo apt-get install -y qemu-user-static
pip install -r requirements.txt
- name: Run speculos tests
run: |
cd tests/speculos
pytest --model ${{ matrix.model }} --path ./elfs/${{ matrix.model }}.elf --display headless
# =====================================================
# RAGGER TESTS
# =====================================================
build_ragger_elfs:
name: Build app for Ragger tests
uses: LedgerHQ/ledger-app-workflows/.github/workflows/reusable_build.yml@v1
with:
upload_app_binaries_artifact: "ragger_elfs"
flags: "DEBUG=1 CAL_TEST_KEY=1 DOMAIN_NAME_TEST_KEY=1 SET_PLUGIN_TEST_KEY=1 NFT_TEST_KEY=1"
jobs-ragger-tests:
name: Run Ragger tests
needs: build_ragger_elfs
uses: LedgerHQ/ledger-app-workflows/.github/workflows/reusable_ragger_tests.yml@v1
with:
download_app_binaries_artifact: "ragger_elfs"
# =====================================================
# STATIC ANALYSIS
# =====================================================
# Static analysis on the main ETH chain is covered by the guidelines enforcer
scan-build:
name: Clang Static Analyzer on altcoin
runs-on: ubuntu-latest
container:
image: ghcr.io/ledgerhq/ledger-app-builder/ledger-app-builder:latest
strategy:
fail-fast: false
matrix:
device: ["nanos", "nanos2", "nanox", "stax"]
steps:
- name: Clone
uses: actions/checkout@v3
with:
submodules: recursive
- name: Build with Clang Static Analyzer
run: |
eval "BOLOS_SDK=\$$(echo ${{ matrix.device }} | tr [:lower:] [:upper:])_SDK" && \
echo "BOLOS_SDK value will be: ${BOLOS_SDK}" && \
make -j ENABLE_SDK_WERROR=1 BOLOS_SDK=${BOLOS_SDK} CHAIN=polygon scan-build
- uses: actions/upload-artifact@v3
if: failure()
with:
name: scan-build
path: scan-build
- name: Upload scan result
if: failure()
uses: actions/upload-artifact@v3
with:
name: scan-build
path: scan-build
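The scan-build job in the removed workflow derives the SDK environment-variable name from the matrix device id by upper-casing it (`nanos2` becomes `$NANOS2_SDK`). A hedged stand-alone sketch of that `eval`/`tr` line (with the `tr` character classes quoted, which is equivalent but shell-safe):

```shell
# Derive the BOLOS_SDK variable name from a device id, mirroring the
# eval "BOLOS_SDK=\$$(echo ${{ matrix.device }} | tr ...)" line above.
device="nanos2"
sdk_var="$(echo "$device" | tr '[:lower:]' '[:upper:]')_SDK"
echo "$sdk_var"   # prints "NANOS2_SDK"
```

The workflow then expands `\$$sdk_var` with `eval` to read the actual SDK path exported by the builder container image.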

.github/workflows/codeql_checks.yml

@@ -0,0 +1,47 @@
name: "CodeQL"
on:
workflow_dispatch:
push:
branches:
- master
- main
- develop
pull_request:
# Excluded path: add the paths you want to ignore instead of deleting the workflow
paths-ignore:
- '.github/workflows/*.yml'
- 'tests/*'
jobs:
analyse:
name: Analyse
strategy:
fail-fast: false
matrix:
sdk: ["$NANOS_SDK", "$NANOX_SDK", "$NANOSP_SDK", "$STAX_SDK", "$FLEX_SDK"]
# 'cpp' covers C and C++
language: ['cpp']
runs-on: ubuntu-latest
container:
image: ghcr.io/ledgerhq/ledger-app-builder/ledger-app-builder-legacy:latest
steps:
- name: Clone
uses: actions/checkout@v4
with:
submodules: true
- name: Initialize CodeQL
uses: github/codeql-action/init@v3
with:
languages: ${{ matrix.language }}
queries: security-and-quality
# CodeQL will create the database during the compilation
- name: Build
run: |
make BOLOS_SDK=${{ matrix.sdk }}
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@v3

.github/workflows/codespell.yml

@@ -0,0 +1,26 @@
name: Misspellings CI
on:
workflow_dispatch:
push:
branches:
- master
- main
- develop
pull_request:
jobs:
misspell:
name: Check misspellings
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
with:
submodules: true
- name: Check misspellings
uses: codespell-project/actions-codespell@v2
with:
builtin: clear,rare
check_filenames: true
path: src, src_bagl, src_features, src_nbgl, src_plugin_sdk, src_plugins, doc, client


@@ -0,0 +1,23 @@
name: Ensure compliance with Ledger guidelines
# This workflow is mandatory in all applications
# It calls a reusable workflow guidelines_enforcer developed by Ledger's internal developer team.
# The successful completion of the reusable workflow is a mandatory step for an app to be available on the Ledger
# application store.
#
# More information on the guidelines can be found in the repository:
# LedgerHQ/ledger-app-workflows/
on:
workflow_dispatch:
push:
branches:
- master
- main
- develop
pull_request:
jobs:
guidelines_enforcer:
name: Call Ledger guidelines_enforcer
uses: LedgerHQ/ledger-app-workflows/.github/workflows/reusable_guidelines_enforcer.yml@v1


@@ -24,3 +24,10 @@ jobs:
source: './'
extensions: 'h,c'
version: 12
yamllint:
name: Check yaml files
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- run: yamllint .


@@ -39,8 +39,9 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Checkout SDK repository
-uses: actions/checkout@v3
+uses: actions/checkout@v4
with:
submodules: true
repository: LedgerHQ/ethereum-plugin-sdk
ref: develop
- name: Retrieve SDK reference
@@ -96,7 +97,7 @@ jobs:
steps:
- name: Checkout plugin repository
-uses: actions/checkout@v3
+uses: actions/checkout@v4
with:
repository: LedgerHQ/${{ matrix.repo }}
submodules: recursive
@@ -144,7 +145,7 @@ jobs:
gh label create 'auto' --color 'b4a8d1' --description 'Automatically created'
fi
-- name: Create pull request and commment on SDK issue
+- name: Create pull request and comment on SDK issue
run: |
# Github limits the number of possible PR being opened in a given time window.
# The limits are 20 creation per minute and 150 per hour.


@@ -23,7 +23,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Clone
-uses: actions/checkout@v3
+uses: actions/checkout@v4
- run: pip install flake8 flake8-pyproject
- name: Flake8 lint Python code
run: (cd client && flake8 src/)
@@ -33,7 +33,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Clone
-uses: actions/checkout@v3
+uses: actions/checkout@v4
- run: pip install mypy
- name: Mypy type checking
run: (cd client && mypy src/)
@@ -44,37 +44,37 @@ jobs:
needs: [lint, mypy]
steps:
- name: Clone
-uses: actions/checkout@v3
+uses: actions/checkout@v4
with:
fetch-depth: 0
- name: Build Python package
run: |
pip install --upgrade pip build twine
cd client/
python -m build;
python -m twine check dist/*
pip install .;
echo "TAG_VERSION=$(python -c 'from ledger_app_clients.ethereum import __version__; print(__version__)')" >> "$GITHUB_ENV"
- name: Check version against CHANGELOG
if: startsWith(github.ref, 'refs/tags/')
run: |
CHANGELOG_VERSION=$(grep -Po '(?<=## \[)(\d+\.)+[^\]]' client/CHANGELOG.md | head -n 1)
if [ "${{ env.TAG_VERSION }}" == "${CHANGELOG_VERSION}" ];
then
echo 'Package and CHANGELOG versions match!';
exit 0;
else
echo "Tag '${{ env.TAG_VERSION }}' and CHANGELOG '${CHANGELOG_VERSION}' versions mismatch!";
exit 1;
fi
- name: Publish Python package on pypi.org
if: success() && github.event_name == 'push'
run: (cd client && python -m twine upload --verbose dist/*)
env:
TWINE_USERNAME: __token__
TWINE_PASSWORD: ${{ secrets.PYPI_PUBLIC_API_TOKEN }}
TWINE_NON_INTERACTIVE: 1
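The version-consistency step in this workflow pulls the latest release number out of `client/CHANGELOG.md` with a Perl-regex lookbehind (GNU grep's `-P`). A self-contained sketch of that extraction, using made-up sample headings:

```shell
# Extract the first "## [X.Y.Z]" heading, as the workflow's
# CHANGELOG_VERSION step does. Requires GNU grep for -P.
CHANGELOG_VERSION=$(printf '## [1.11.0](https://example.invalid) - 2024-07-24\n## [1.10.4](https://example.invalid) - 2024-03-08\n' \
    | grep -Po '(?<=## \[)(\d+\.)+[^\]]' | head -n 1)
echo "$CHANGELOG_VERSION"   # prints "1.11.0"
```

The workflow then compares this against the package's `__version__` tag and fails the job on a mismatch, which keeps PyPI releases and the changelog in lockstep.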


@@ -1,52 +0,0 @@
---
name: Updating the SDK
on:
workflow_dispatch:
push:
branches:
- master
- develop
jobs:
updating_SDK:
name: Updating the SDK
runs-on: ubuntu-latest
steps:
- name: Clone
uses: actions/checkout@v3
with:
# by default the action uses fetch-depth = 1, which creates
# shallow repositories from which we can't push
fetch-depth: 0
submodules: recursive
# needed, else the push inside the action will use default credentials
# instead of provided ones
persist-credentials: false
- name: Build new SDK
run: ./tools/build_sdk.sh
- name: Extract branch name
shell: bash
run: echo "##[set-output name=branch;]$(echo ${GITHUB_REF#refs/heads/})"
id: extract_branch
- name: Commit & push changes in the SDK if any
uses: ./.github/actions/commit-changes
with:
name: 'ldg-github-ci'
directory: ethereum-plugin-sdk
dst_branch: ${{ steps.extract_branch.outputs.branch }}
message: "[update] Branch ${{ steps.extract_branch.outputs.branch }} | Commit ${GITHUB_SHA}"
secret: ${{ secrets.CI_BOT_TOKEN }}
repository: LedgerHQ/ethereum-plugin-sdk
- name: Create SDK update pull request
uses: peter-evans/create-pull-request@v4
with:
branch: sdk/update-submodule
delete-branch: true
title: Update the SDK submodule
reviewers: apailler-ledger

.gitignore

@@ -1,22 +1,17 @@
# Compilation of Ledger's app
src/glyphs.c
src/glyphs.h
bin/
debug/
dep/
obj/
build/
# Unit tests and code coverage
tests/unit/build/
tests/unit/coverage/
tests/unit/coverage.*
# Python
venv/
*.pyc
__version__.py
# JS
tests/node_modules
tests/lib
tests/yarn-error.log
tests/elfs/*
tests/snapshots-tmp
.vscode
.idea

.mdl.rb

@@ -0,0 +1,11 @@
# Style file for mdl
# https://github.com/markdownlint/markdownlint/blob/main/docs/creating_styles.md
# Include all rules
all
# Disable specific rules
#exclude_rule 'MD012'
# Update rules configuration
rule 'MD013', :line_length => 120

.mdlrc

@@ -0,0 +1,14 @@
# markdownlint config file
# Use custom style file
style "#{File.dirname(__FILE__)}/.mdl.rb"
# MD002 - First header in file should be a top level header
# MD005 - Inconsistent indentation for list items at the same level
# MD007 - Unordered list indentation
# MD014 - Dollar signs used before commands without showing output
# MD024 - Multiple headers with the same content
# MD029 - Ordered list item prefix
# MD033 - Inline HTML
# MD041 - First line in file should be a top level header
rules "~MD002,~MD005,~MD007,~MD014,~MD024,~MD029,~MD033,~MD041"

.pre-commit-config.yaml

@@ -0,0 +1,47 @@
# To install hooks, run:
# pre-commit install --hook-type pre-commit
# pre-commit install --hook-type commit-msg
repos:
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v4.6.0
hooks:
- id: trailing-whitespace
- id: end-of-file-fixer
- id: mixed-line-ending
- id: check-added-large-files
- id: check-merge-conflict
- id: check-case-conflict
- repo: https://github.com/codespell-project/codespell
rev: v2.2.6
hooks:
- id: codespell
args: ['--ignore-words-list', 'ontop,shft,hte', '--skip', 'makefile_conf/chain/*,tests/ragger/eip712_input_files/*']
- repo: https://github.com/pre-commit/mirrors-clang-format
rev: v12.0.1
hooks:
- id: clang-format
types_or: [c]
- repo: https://github.com/Mateusz-Grzelinski/actionlint-py
rev: v1.6.27.13
hooks:
- id: actionlint
types_or: [yaml]
args: [-shellcheck='' -pyflakes='']
- repo: https://github.com/markdownlint/markdownlint
rev: v0.12.0
hooks:
- id: markdownlint
types_or: [markdown]
- repo: https://github.com/PyCQA/pylint
rev: v2.16.2
hooks:
- id: pylint
types: [python]
args: ['--jobs=0', '--rcfile=tests/ragger/setup.cfg']
files: '^tests/ragger/.*$'

.yamllint.yml

@@ -0,0 +1,7 @@
---
extends: default
rules:
document-start: disable
line-length: disable
truthy: disable


@@ -5,7 +5,52 @@ All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](http://keepachangelog.com/)
and this project adheres to [Semantic Versioning](http://semver.org/).
## [1.10.4](https://github.com/ledgerhq/app-ethereum/compare/1.10.3...1.10.4) - 2023-03-08
## [1.11.0](https://github.com/ledgerhq/app-ethereum/compare/1.10.4...1.11.0) - 2024-07-24
### Added
- (network) Base Sepolia
- (network) Blast
- (network) Blast Sepolia
- (network) Mantle
- (network) Mantle Sepolia
- (network) Arbitrum Sepolia
- (network) Linea Sepolia
- (network) OP Sepolia
- (network) Etherlink Mainnet
- (network) ZetaChain
- (network) Astar zkEVM
- (network) Lisk
- (network) Lisk Sepolia
- (network) ZKsync
- (network) BOB
- (network) Electroneum
- New EIP-712 filtering modes (datetime, amount-join)
- New blind-signing warning flow before every blind-signed transaction flow
- New "From" field in transactions containing the wallet's derived address
- Ledger Flex support
### Removed
- (clone) Flare
- (clone) Flare Coston
- (clone) Eth Goerli
- (clone) Eth Ropsten
- Wallet ID support
- U2F support
- Blind-signing setting
### Changed
- Renamed Optimism to OP Mainnet
- Can now store up to 5 assets information (instead of 2)
- Can now buffer & show multiple EIP-712 fields on one page for NBGL devices
- Renamed the "Address" field in transactions to "To"
### Fixed
- Handling of EIP-712 empty arrays within nested structs
## [1.10.4](https://github.com/ledgerhq/app-ethereum/compare/1.10.3...1.10.4) - 2024-03-08
### Added
@@ -122,7 +167,8 @@ and this project adheres to [Semantic Versioning](http://semver.org/).
### Changed
-- EIP-191 improvements, now lets the user see the entire message one chunk at a time (255 characters for LNX & LNS+, 99 for LNS)
+- EIP-191 improvements, now lets the user see the entire message one chunk at a time
+(255 characters for LNX & LNS+, 99 for LNS)
### Fixed
@@ -143,7 +189,8 @@ and this project adheres to [Semantic Versioning](http://semver.org/).
### Changed
-- EIP-191 signatures now show (up to 99 characters on LNS and 255 on LNX & LNS+) the actual data contained in the message (clear-signing)
+- EIP-191 signatures now show (up to 99 characters on LNS and 255 on LNX & LNS+) the actual data
+contained in the message (clear-signing)
### Fixed
@@ -209,7 +256,7 @@ and this project adheres to [Semantic Versioning](http://semver.org/).
### Added
-- Provide network ticker to plugins (especialy helpful for Paraswap plugin)
+- Provide network ticker to plugins (especially helpful for Paraswap plugin)
- Polygon variant
## [1.9.10](https://github.com/ledgerhq/app-ethereum/compare/1.9.9...1.9.10) - 2021-10-08
@@ -254,7 +301,8 @@ and this project adheres to [Semantic Versioning](http://semver.org/).
### Added
-- When blind signing is disabled in settings, and a transaction with smart conract interactions is sent to the app, a new warning screen pops to let the user know that the setting must be enabled to sign this kind of transactions.
+- When blind signing is disabled in settings, and a transaction with smart conract interactions is sent to the app,
+a new warning screen pops to let the user know that the setting must be enabled to sign this kind of transactions.
## [1.9.4](https://github.com/ledgerhq/app-ethereum/compare/1.9.3...1.9.4) - 2021-9-14

Makefile

@@ -21,292 +21,41 @@ endif
include $(BOLOS_SDK)/Makefile.defines
DEFINES_LIB = USE_LIB_ETHEREUM
APP_LOAD_PARAMS = --curve secp256k1 $(COMMON_LOAD_PARAMS)
# Allow the app to use path 45 for multi-sig (see BIP45).
APP_LOAD_PARAMS += --path "45'"
# Samsung temporary implementation for wallet ID on 0xda7aba5e/0xc1a551c5
APP_LOAD_PARAMS += --path "1517992542'/1101353413'"
##################
# Define Version #
##################
APPVERSION_M = 1
APPVERSION_N = 10
APPVERSION_P = 4
APPVERSION = $(APPVERSION_M).$(APPVERSION_N).$(APPVERSION_P)
APP_LOAD_FLAGS = --appFlags 0xa40 --dep Ethereum:$(APPVERSION)
###########################
# Set Chain environment #
###########################
########################################
# Mandatory configuration #
########################################
ifeq ($(CHAIN),)
CHAIN = ethereum
# Temporary definition to ensure VSCode extension works... To be cleaned later
APPNAME = Ethereum
endif
SUPPORTED_CHAINS = $(shell find makefile_conf/chain/ -type f -name '*.mk'| sed 's/.*\/\(.*\).mk/\1/g' | sort)
# Check if chain is available
ifeq ($(shell test -s ./makefile_conf/chain/$(CHAIN).mk && echo -n yes), yes)
include ./makefile_conf/chain/$(CHAIN).mk
else
$(error Unsupported CHAIN - use $(SUPPORTED_CHAINS))
ifneq ($(CHAIN),$(filter $(CHAIN),$(SUPPORTED_CHAINS)))
$(error Unsupported CHAIN. Use one of: $(SUPPORTED_CHAINS))
endif
CFLAGS += -DAPPNAME=\"$(APPNAME)\"
DEFINES += CHAINID_COINNAME=\"$(TICKER)\" CHAIN_ID=$(CHAIN_ID)
include ./makefile_conf/chain/$(CHAIN).mk
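The `SUPPORTED_CHAINS` list above is discovered at build time by scanning `makefile_conf/chain/` for `.mk` files and stripping the path and extension. A sketch of that `find | sed | sort` pipeline run against a throwaway directory (the directory and chain names here are illustrative only):

```shell
# Reproduce the SUPPORTED_CHAINS discovery from the Makefile in plain shell.
tmpdir=$(mktemp -d)
mkdir -p "$tmpdir/makefile_conf/chain"
touch "$tmpdir/makefile_conf/chain/ethereum.mk" \
      "$tmpdir/makefile_conf/chain/polygon.mk"
# Strip leading path and trailing ".mk", then sort, as the Makefile does.
chains=$(find "$tmpdir/makefile_conf/chain/" -type f -name '*.mk' \
    | sed 's/.*\/\(.*\)\.mk/\1/g' | sort | tr '\n' ' ')
echo "$chains"
rm -rf "$tmpdir"
```

With the chain list computed this way, the guard that follows can reject any `CHAIN=` value that has no matching `.mk` fragment and print the valid options in the error message.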
#########
# Other #
#########
APPVERSION_M = 1
APPVERSION_N = 11
APPVERSION_P = 0
APPVERSION = $(APPVERSION_M).$(APPVERSION_N).$(APPVERSION_P)
APP_LOAD_PARAMS += $(APP_LOAD_FLAGS) --path "44'/1'"
DEFINES += $(DEFINES_LIB)
#prepare hsm generation
ifeq ($(TARGET_NAME),TARGET_NANOS)
ICONNAME = icons/nanos_app_chain_$(CHAIN_ID).gif
else ifeq ($(TARGET_NAME),TARGET_STAX)
ICONNAME = icons/stax_app_chain_$(CHAIN_ID).gif
DEFINES += ICONGLYPH=C_stax_chain_$(CHAIN_ID)_64px
DEFINES += ICONBITMAP=C_stax_chain_$(CHAIN_ID)_64px_bitmap
DEFINES += ICONGLYPH_SMALL=C_stax_chain_$(CHAIN_ID)
GLYPH_FILES += $(ICONNAME)
else
ICONNAME = icons/nanox_app_chain_$(CHAIN_ID).gif
endif
################
# Default rule #
################
all: default
############
# Platform #
############
DEFINES += OS_IO_SEPROXYHAL
DEFINES += HAVE_SPRINTF HAVE_SNPRINTF_FORMAT_U
DEFINES += HAVE_IO_USB HAVE_L4_USBLIB IO_USB_MAX_ENDPOINTS=4 IO_HID_EP_LENGTH=64 HAVE_USB_APDU
DEFINES += LEDGER_MAJOR_VERSION=$(APPVERSION_M) LEDGER_MINOR_VERSION=$(APPVERSION_N) LEDGER_PATCH_VERSION=$(APPVERSION_P)
DEFINES += BUILD_YEAR=\"$(shell date +%Y)\"
# U2F
DEFINES += HAVE_U2F HAVE_IO_U2F
DEFINES += U2F_PROXY_MAGIC=\"w0w\"
DEFINES += USB_SEGMENT_SIZE=64
DEFINES += BLE_SEGMENT_SIZE=32 #max MTU, min 20
DEFINES += APPVERSION=\"$(APPVERSION)\"
#WEBUSB_URL = www.ledgerwallet.com
#DEFINES += HAVE_WEBUSB WEBUSB_URL_SIZE_B=$(shell echo -n $(WEBUSB_URL) | wc -c) WEBUSB_URL=$(shell echo -n $(WEBUSB_URL) | sed -e "s/./\\\'\0\\\',/g")
DEFINES += HAVE_WEBUSB WEBUSB_URL_SIZE_B=0 WEBUSB_URL=""
ifneq (,$(filter $(TARGET_NAME),TARGET_NANOX TARGET_STAX))
DEFINES += HAVE_BLE BLE_COMMAND_TIMEOUT_MS=2000
DEFINES += HAVE_BLE_APDU # basic ledger apdu transport over BLE
SDK_SOURCE_PATH += lib_blewbxx lib_blewbxx_impl
endif
ifeq ($(TARGET_NAME),TARGET_NANOS)
DEFINES += IO_SEPROXYHAL_BUFFER_SIZE_B=128
else
DEFINES += IO_SEPROXYHAL_BUFFER_SIZE_B=300
endif
ifeq ($(TARGET_NAME),TARGET_STAX)
DEFINES += NBGL_QRCODE
SDK_SOURCE_PATH += qrcode
else
DEFINES += HAVE_BAGL
DEFINES += HAVE_UX_FLOW
ifeq ($(TARGET_NAME),TARGET_NANOS)
DEFINES += HAVE_WALLET_ID_SDK
DEFINES += BAGL_WIDTH=128 BAGL_HEIGHT=32
else
DEFINES += HAVE_GLO096
DEFINES += BAGL_WIDTH=128 BAGL_HEIGHT=64
DEFINES += HAVE_BAGL_ELLIPSIS # long label truncation feature
DEFINES += HAVE_BAGL_FONT_OPEN_SANS_REGULAR_11PX
DEFINES += HAVE_BAGL_FONT_OPEN_SANS_EXTRABOLD_11PX
DEFINES += HAVE_BAGL_FONT_OPEN_SANS_LIGHT_16PX
endif
endif
####################
# Enabled Features #
####################
# Enables direct data signing without having to specify it in the settings. Useful when testing with speculos.
ALLOW_DATA ?= 0
ifneq ($(ALLOW_DATA),0)
DEFINES += HAVE_ALLOW_DATA
endif
# Bypass the signature verification for setExternalPlugin, setPlugin, provideERC20TokenInfo and provideNFTInfo calls
BYPASS_SIGNATURES ?= 0
ifneq ($(BYPASS_SIGNATURES),0)
DEFINES += HAVE_BYPASS_SIGNATURES
endif
# Enable the SET_PLUGIN test key
SET_PLUGIN_TEST_KEY ?= 0
ifneq ($(SET_PLUGIN_TEST_KEY),0)
DEFINES += HAVE_SET_PLUGIN_TEST_KEY
endif
# NFTs
ifneq ($(TARGET_NAME),TARGET_NANOS)
DEFINES += HAVE_NFT_SUPPORT
NFT_TEST_KEY ?= 0
ifneq ($(NFT_TEST_KEY),0)
DEFINES += HAVE_NFT_TEST_KEY
endif
NFT_STAGING_KEY ?= 0
ifneq ($(NFT_STAGING_KEY),0)
# Key used by the staging backend
DEFINES += HAVE_NFT_STAGING_KEY
endif
endif
ifneq (,$(filter $(DEFINES),HAVE_NFT_TEST_KEY))
ifneq (, $(filter $(DEFINES),HAVE_NFT_STAGING_KEY))
$(error Multiple alternative NFT keys set at once)
endif
endif
# Dynamic memory allocator
ifneq ($(TARGET_NAME),TARGET_NANOS)
DEFINES += HAVE_DYN_MEM_ALLOC
endif
# EIP-712
ifneq ($(TARGET_NAME),TARGET_NANOS)
DEFINES += HAVE_EIP712_FULL_SUPPORT
endif
# CryptoAssetsList key
CAL_TEST_KEY ?= 0
ifneq ($(CAL_TEST_KEY),0)
# Key used in our test framework
DEFINES += HAVE_CAL_TEST_KEY
endif
CAL_STAGING_KEY ?= 0
ifneq ($(CAL_STAGING_KEY),0)
# Key used by the staging CAL
DEFINES += HAVE_CAL_STAGING_KEY
endif
ifneq (,$(filter $(DEFINES),HAVE_CAL_TEST_KEY))
ifneq (, $(filter $(DEFINES),HAVE_CAL_STAGING_KEY))
# Can't use both the staging and testing keys
$(error Multiple alternative CAL keys set at once)
endif
endif
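The nested `ifneq`/`$(filter ...)` blocks above make the test and staging CAL keys mutually exclusive: enabling both aborts the build. A hedged shell stand-in for that guard (`check_cal_keys` is a hypothetical name; the real check filters `DEFINES` in Make, not shell):

```shell
# Mutual-exclusion guard: succeed unless both alternative keys are enabled,
# mirroring the Makefile's "Multiple alternative CAL keys set at once" error.
check_cal_keys() {
    test_key="$1" staging_key="$2"
    if [ "$test_key" != 0 ] && [ "$staging_key" != 0 ]; then
        echo "Error: multiple alternative CAL keys set at once" >&2
        return 1
    fi
    return 0
}

check_cal_keys 1 0 && echo "CAL key configuration OK"
```

The same pattern appears earlier for the NFT keys (`HAVE_NFT_TEST_KEY` vs `HAVE_NFT_STAGING_KEY`), so only one alternative signing key of each kind can ever be baked into a build.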
# ENS
ifneq ($(TARGET_NAME),TARGET_NANOS)
DEFINES += HAVE_DOMAIN_NAME
DOMAIN_NAME_TEST_KEY ?= 0
ifneq ($(DOMAIN_NAME_TEST_KEY),0)
DEFINES += HAVE_DOMAIN_NAME_TEST_KEY
endif
endif
# Enabling debug PRINTF
ifneq ($(DEBUG),0)
DEFINES += HAVE_STACK_OVERFLOW_CHECK
ifeq ($(TARGET_NAME),TARGET_NANOS)
DEFINES += HAVE_PRINTF PRINTF=screen_printf
else
DEFINES += HAVE_PRINTF PRINTF=mcu_usb_printf
endif
else
DEFINES += PRINTF\(...\)=
endif
ifneq ($(NOCONSENT),)
DEFINES += NO_CONSENT
endif
##############
# Compiler #
##############
ifneq ($(BOLOS_ENV),)
$(info BOLOS_ENV=$(BOLOS_ENV))
CLANGPATH := $(BOLOS_ENV)/clang-arm-fropi/bin/
GCCPATH := $(BOLOS_ENV)/gcc-arm-none-eabi-5_3-2016q1/bin/
else
$(info BOLOS_ENV is not set: falling back to CLANGPATH and GCCPATH)
endif
ifeq ($(CLANGPATH),)
$(info CLANGPATH is not set: clang will be used from PATH)
endif
ifeq ($(GCCPATH),)
$(info GCCPATH is not set: arm-none-eabi-* will be used from PATH)
endif
CC := $(CLANGPATH)clang
CFLAGS += -Wno-format-invalid-specifier -Wno-format-extra-args
AS := $(GCCPATH)arm-none-eabi-gcc
LD := $(GCCPATH)arm-none-eabi-gcc
LDLIBS += -lm -lgcc -lc
# import rules to compile glyphs
include $(BOLOS_SDK)/Makefile.glyphs
### variables processed by the common makefile.rules of the SDK to grab source files and include dirs
APP_SOURCE_PATH += src_common src src_features src_plugins
SDK_SOURCE_PATH += lib_stusb lib_stusb_impl lib_u2f
ifeq ($(TARGET_NAME),TARGET_STAX)
# Application source files
APP_SOURCE_PATH += src src_features src_plugins
ifeq ($(TARGET_NAME),$(filter $(TARGET_NAME),TARGET_STAX TARGET_FLEX))
APP_SOURCE_PATH += src_nbgl
else
SDK_SOURCE_PATH += lib_ux
APP_SOURCE_PATH += src_bagl
endif
# Allow usage of function from lib_standard_app/crypto_helpers.c
APP_SOURCE_FILES += $(filter-out ./ethereum-plugin-sdk/src/main.c, $(wildcard ./ethereum-plugin-sdk/src/*.c))
INCLUDES_PATH += ./ethereum-plugin-sdk/src
APP_SOURCE_FILES += ${BOLOS_SDK}/lib_standard_app/crypto_helpers.c
APP_SOURCE_FILES += ${BOLOS_SDK}/lib_standard_app/format.c
INCLUDES_PATH += ${BOLOS_SDK}/lib_standard_app
### initialize plugin SDK submodule if needed, rebuild it, and warn if a difference is noticed
ifeq ($(CHAIN),ethereum)
ifneq ($(shell git submodule status | grep '^[-+]'),)
$(info INFO: Need to reinitialize git submodules)
$(shell git submodule update --init)
endif
# rebuild SDK
$(shell ./tools/build_sdk.sh)
# check if a difference is noticed (fail if it happens in CI build)
ifneq ($(shell git status | grep 'ethereum-plugin-sdk'),)
ifneq ($(JENKINS_URL),)
$(error ERROR: please update ethereum-plugin-sdk submodule first)
else
$(warning WARNING: please update ethereum-plugin-sdk submodule first)
endif
endif
endif
load: all
python3 -m ledgerblue.loadApp $(APP_LOAD_PARAMS)
delete:
python3 -m ledgerblue.deleteApp $(COMMON_DELETE_PARAMS)
install_tests:
cd tests/zemu/ && (yarn install || sudo yarn install)
run_tests:
cd tests/zemu/ && (yarn test || sudo yarn test)
test: install_tests run_tests
unit-test:
make -C tests/unit
ifeq ($(TARGET_NAME),TARGET_STAX)
ifeq ($(TARGET_NAME),$(filter $(TARGET_NAME),TARGET_STAX TARGET_FLEX))
NETWORK_ICONS_FILE = $(GEN_SRC_DIR)/net_icons.gen.c
NETWORK_ICONS_DIR = $(shell dirname "$(NETWORK_ICONS_FILE)")
@@ -316,11 +65,103 @@ $(NETWORK_ICONS_FILE):
APP_SOURCE_FILES += $(NETWORK_ICONS_FILE)
endif
# import generic rules from the sdk
include $(BOLOS_SDK)/Makefile.rules
# Application icons following guidelines:
# https://developers.ledger.com/docs/embedded-app/design-requirements/#device-icon
ICON_NANOS = icons/nanos_app_chain_$(CHAIN_ID).gif
ICON_NANOX = icons/nanox_app_chain_$(CHAIN_ID).gif
ICON_NANOSP = icons/nanox_app_chain_$(CHAIN_ID).gif
ICON_STAX = icons/stax_app_chain_$(CHAIN_ID).gif
ICON_FLEX = icons/flex_app_chain_$(CHAIN_ID).gif
#add dependency on custom makefile filename
dep/%.d: %.c Makefile
#prepare hsm generation
ifeq ($(TARGET_NAME),$(filter $(TARGET_NAME),TARGET_STAX TARGET_FLEX))
DEFINES += ICONGLYPH=C_chain_$(CHAIN_ID)_64px
DEFINES += ICONBITMAP=C_chain_$(CHAIN_ID)_64px_bitmap
DEFINES += ICONGLYPH_SMALL=C_chain_$(CHAIN_ID)
endif
listvariants:
@echo VARIANTS CHAIN $(SUPPORTED_CHAINS)
# Application allowed derivation curves.
# Possibles curves are: secp256k1, secp256r1, ed25519 and bls12381g1
# If your app needs it, you can specify multiple curves by using:
# `CURVE_APP_LOAD_PARAMS = <curve1> <curve2>`
CURVE_APP_LOAD_PARAMS += secp256k1
# Application allowed derivation paths.
# You should request a specific path for your app.
# This serves as an isolation mechanism.
# Most applications will have to request a path according to the BIP-0044
# and SLIP-0044 standards.
# If your app needs it, you can specify multiple paths by using:
# `PATH_APP_LOAD_PARAMS = "44'/1'" "45'/1'"`
PATH_APP_LOAD_PARAMS += "45'" "44'/1'"
# Setting to allow building variant applications
# - <VARIANT_PARAM> is the name of the parameter which should be set
# to specify the variant that should be built.
# - <VARIANT_VALUES> a list of variants that can be built using this app code.
# * It must contain at least one value.
# * Values can be the app ticker or anything else but should be unique.
VARIANT_PARAM = CHAIN
VARIANT_VALUES = $(SUPPORTED_CHAINS)
# Activate dependency only for specific CHAIN
ifneq ($(CHAIN),ethereum)
DEP_APP_LOAD_PARAMS = Ethereum:$(APPVERSION)
DEFINES_LIB = USE_LIB_ETHEREUM
endif
# Enabling DEBUG flag will enable PRINTF and disable optimizations
#DEBUG = 1
########################################
# Application custom permissions #
########################################
# See SDK `include/appflags.h` for the purpose of each permission
#HAVE_APPLICATION_FLAG_DERIVE_MASTER = 1
HAVE_APPLICATION_FLAG_GLOBAL_PIN = 1
HAVE_APPLICATION_FLAG_BOLOS_SETTINGS = 1
HAVE_APPLICATION_FLAG_LIBRARY = 1
########################################
# Application communication interfaces #
########################################
ENABLE_BLUETOOTH = 1
#ENABLE_NFC = 1
########################################
# NBGL custom features #
########################################
ENABLE_NBGL_QRCODE = 1
#ENABLE_NBGL_KEYBOARD = 1
#ENABLE_NBGL_KEYPAD = 1
########################################
# Features disablers #
########################################
# These advanced settings allow disabling some features that are enabled
# by default in the SDK `Makefile.standard_app`.
DISABLE_STANDARD_APP_FILES = 1
#DISABLE_DEFAULT_IO_SEPROXY_BUFFER_SIZE = 1 # To allow custom size declaration
#DISABLE_STANDARD_APP_DEFINES = 1 # Will set all the following disablers
#DISABLE_STANDARD_SNPRINTF = 1
#DISABLE_STANDARD_USB = 1
#DISABLE_STANDARD_WEBUSB = 1
#DISABLE_STANDARD_BAGL_UX_FLOW = 1
#DISABLE_DEBUG_LEDGER_ASSERT = 1
#DISABLE_DEBUG_THROW = 1
########################################
# Main app configuration #
########################################
DEFINES += CHAINID_COINNAME=\"$(TICKER)\" CHAIN_ID=$(CHAIN_ID)
DEFINES += BUILD_YEAR=\"$(shell date +%Y)\"
# Enabled Features #
include makefile_conf/features.mk
#########################
# Import generic rules from the SDK
include $(BOLOS_SDK)/Makefile.standard_app
README.md
@@ -25,24 +25,22 @@
- [About the project](#about-the-project)
- [Documentation](#documentation)
- [Plugins](#plugins)
- [Testing](#testing)
- [Requirements](#requirements)
- [Build the applications required by the test suite](#build-the-applications-required-by-the-test-suite)
- [Running all tests](#running-all-tests)
- [With Makefile](#with-makefile)
- [With yarn](#with-yarn)
- [Running a specific tests](#running-a-specific-tests)
- [Adding tests](#adding-tests)
- [Zemu](#zemu)
- [Update binaries](#update-binaries)
- [Quick start guide](#quick-start-guide)
- [With VSCode](#with-vscode)
- [With a terminal](#with-a-terminal)
- [Compilation and load](#compilation-and-load)
- [Compilation](#compilation)
- [Loading on a physical device](#loading-on-a-physical-device)
- [Tests](#tests)
- [Functional Tests (Ragger based)](#functional-tests-ragger-based)
- [Unit Tests](#unit-tests)
- [Contributing](#contributing)
</details>
## About the project
Ethereum wallet application framework for Nano S, Nano S Plus and Nano X.
Ledger Blue is not maintained anymore, but the app can still be compiled for this target using the branch [`blue-final-release`](https://github.com/LedgerHQ/app-ethereum/tree/blue-final-release).
## Documentation
@@ -53,101 +51,223 @@ To compile it and load it on a device, please check out our [developer portal](h
### Plugins
We have the concept of plugins in the ETH app.
Find the documentation here:
- [Blog Ethereum plugins](https://blog.ledger.com/ethereum-plugins/)
- [Ethereum application Plugins : Technical Specifications](https://github.com/LedgerHQ/app-ethereum/blob/master/doc/ethapp_plugins.asc)
- [Plugin guide](https://hackmd.io/300Ukv5gSbCbVcp3cZuwRQ)
- [Boilerplate plugin](https://github.com/LedgerHQ/app-plugin-boilerplate)
## Testing
## Quick start guide
Testing is done via the open-source framework [zemu](https://github.com/Zondax/zemu).
### With VSCode
### Requirements
You can quickly setup a convenient environment to build and test your application by using
[Ledger's VSCode developer tools extension](https://marketplace.visualstudio.com/items?itemName=LedgerHQ.ledger-dev-tools)
which leverages the [ledger-app-dev-tools](https://github.com/LedgerHQ/ledger-app-builder/pkgs/container/ledger-app-builder%2Fledger-app-dev-tools)
docker image.
- [nodeJS == 16](https://github.com/nvm-sh/nvm)
- [yarn](https://classic.yarnpkg.com/lang/en/docs/install/#debian-stable)
- [build environment](https://github.com/LedgerHQ/ledger-app-builder/blob/master/Dockerfile)
It will allow you, whether you are developing on macOS, Windows or Linux,
to quickly **build** your apps, **test** them on **Speculos** and **load** them on any supported device.
#### Build the applications required by the test suite
- Install and run [Docker](https://www.docker.com/products/docker-desktop/).
- Make sure you have an X11 server running:
- On Ubuntu Linux, it should be running by default.
- On macOS, install and launch [XQuartz](https://www.xquartz.org/)
(make sure to go to XQuartz > Preferences > Security and check "Allow client connections").
- On Windows, install and launch [VcXsrv](https://sourceforge.net/projects/vcxsrv/)
(make sure to configure it to disable access control).
- Install [VScode](https://code.visualstudio.com/download) and add [Ledger's extension](https://marketplace.visualstudio.com/items?itemName=LedgerHQ.ledger-dev-tools).
- Open a terminal and clone `app-ethereum` with `git clone git@github.com:LedgerHQ/app-ethereum.git`.
- Open the `app-ethereum` folder with VSCode.
- Use Ledger extension's sidebar menu or open the tasks menu with `ctrl + shift + b`
(`command + shift + b` on a Mac) to conveniently execute actions:
- Build the app for the device model of your choice with `Build`.
- Test your binary on [Speculos](https://github.com/LedgerHQ/speculos) with `Run with Speculos`.
- You can also run functional tests, load the app on a physical device, and more.
1. Add your BOLOS SDKs path to:
- `NANOS_SDK` and `NANOX_SDK`
> The terminal tab of VSCode will show you what commands the extension runs behind the scenes.
2. Go to the `tests` folder and run `./build_local_test_elfs.sh`
- ```sh
cd tests
# This helper script will build the applications required by the test suite and move them to the right place.
yarn install
./build_local_test_elfs.sh
```
### With a terminal
### Running all tests
#### With Makefile
The [ledger-app-dev-tools](https://github.com/LedgerHQ/ledger-app-builder/pkgs/container/ledger-app-builder%2Fledger-app-dev-tools)
docker image contains all the required tools and libraries to **build**, **test** and **load** an application.
1. Then you can install and run the tests by simply running, at the root of the repo:
- ```sh
make test
```
- This will run `make install_tests` and `make run_tests`
You can download it from the ghcr.io docker repository:
#### With yarn
1. Go to the `tests` folder and run:
- ```sh
yarn test
```
### Running a specific tests
1. Go to the `tests` folder and run:
- ```sh
yarn jest --runInBand --detectOpenHandles {YourTestFile}
```
2. For example, with the `send` test:
- ```sh
yarn jest --runInBand --detectOpenHandles src/send.test.js
```
### Adding tests
#### Zemu
To add tests, copy one of the already existing test files in `tests/src/`.
You then need to adapt the `buffer` and `tx` variables to match the APDU you wish to send.
- Adapt the expected screen flow. Please create a folder under `tests/snapshots` with the name of the test you're performing.
- Then adapt the `ORIGINAL_SNAPSHOT_PATH_PREFIX` with the name of the folder you just created.
- To create the snapshots, modify the `SNAPSHOT_PATH_PREFIX` and set it to be equal to `ORIGINAL_SNAPSHOT_PATH_PREFIX`.
- Run the tests once; this will create all the snapshots in the folder you created.
- Put back your `SNAPSHOT_PATH_PREFIX` to `snapshots/tmp/`.
Finally make sure you adapt the expected signature!
#### Update binaries
Don't forget to update the binaries in the test folder. To do so, compile with these environment variables:
```sh
make DEBUG=1 ALLOW_DATA=1
```

```shell
sudo docker pull ghcr.io/ledgerhq/ledger-app-builder/ledger-app-dev-tools:latest
```
Then copy the binary to the `tests/elfs` folder (in this case, compiled with SDK for nanoS):
You can then enter this development environment by executing the following command
from the root directory of the application `git` repository:
```sh
cp bin/app.elf tests/elfs/ethereum_nanos.elf
```

#### Linux (Ubuntu)

```shell
sudo docker run --rm -ti --user "$(id -u):$(id -g)" --privileged -v "/dev/bus/usb:/dev/bus/usb" -v "$(realpath .):/app" ghcr.io/ledgerhq/ledger-app-builder/ledger-app-dev-tools:latest
```
Repeat the operation for a binary compiled with the nanoX SDK, naming it `ethereum_nanox.elf`.
#### macOS
```shell
sudo docker run --rm -ti --user "$(id -u):$(id -g)" --privileged -v "$(pwd -P):/app" ghcr.io/ledgerhq/ledger-app-builder/ledger-app-dev-tools:latest
```
#### Windows (with PowerShell)
```shell
docker run --rm -ti --privileged -v "$(Get-Location):/app" ghcr.io/ledgerhq/ledger-app-builder/ledger-app-dev-tools:latest
```
The application's code will be available from inside the docker container;
you can then proceed to the following compilation steps to build your app.
## Compilation and load
To easily setup a development environment for compilation and loading on a physical device, you can use the [VSCode integration](#with-vscode)
whether you are on Linux, macOS or Windows.
If you prefer using a terminal to perform the steps manually, you can use the guide below.
### Compilation
Set up a compilation environment by following the [shell with docker approach](#with-a-terminal).
Be sure to check out the submodule:
```shell
git submodule update --init
```
From inside the container, use the following command to build the app:
```shell
make DEBUG=1 # compile optionally with PRINTF
```
You can choose which device to compile and load for by setting the `BOLOS_SDK` environment variable to the following values:
- `BOLOS_SDK=$NANOS_SDK`
- `BOLOS_SDK=$NANOX_SDK`
- `BOLOS_SDK=$NANOSP_SDK`
- `BOLOS_SDK=$STAX_SDK`
### Loading on a physical device
This step will vary slightly depending on your platform.
> Your physical device must be connected, unlocked and the screen showing the dashboard (not inside an application).
#### Linux (Ubuntu)
First make sure you have the proper udev rules added on your host.
See [udev-rules](https://github.com/LedgerHQ/udev-rules)
Then once you have [opened a terminal](#with-a-terminal) in the `app-builder` image and [built the app](#compilation-and-load)
for the device you want, run the following command:
```shell
# Run this command from the app-builder container terminal.
make load # load the app on a Nano S by default
```
[Setting the BOLOS_SDK environment variable](#compilation-and-load) will allow you to load
on whichever supported device you want.
#### macOS / Windows (with PowerShell)
> It is assumed you have [Python](https://www.python.org/downloads/) installed on your computer.
Run these commands on your host from the app's source folder once you have [built the app](#compilation-and-load)
for the device you want:
```shell
# Install Python virtualenv
python3 -m pip install virtualenv
# Create the 'ledger' virtualenv
python3 -m virtualenv ledger
```
Enter the Python virtual environment
- macOS: `source ledger/bin/activate`
- Windows: `.\ledger\Scripts\Activate.ps1`
```shell
# Install Ledgerblue (tool to load the app)
python3 -m pip install ledgerblue
# Load the app.
python3 -m ledgerblue.runScript --scp --fileName bin/app.apdu --elfFile bin/app.elf
```
## Tests
The Ethereum app comes with different tests:
- Functional Tests implemented with Ledger's [Ragger](https://github.com/LedgerHQ/ragger) test framework.
- Unit Tests, covering basic functions
### Functional Tests (Ragger based)
#### Linux (Ubuntu)
On Linux, you can use [Ledger's VS Code extension](#with-vscode) to run the tests.
If you prefer not to, open a terminal and follow the steps below.
Install the tests requirements:
```shell
pip install -r tests/ragger/requirements.txt
```
Then you can:
Run the functional tests (here for nanos but available for any device once you have built the binaries):
```shell
pytest tests/ragger/ --tb=short -v --device nanos
```
Please see the corresponding documentation: [USAGE](tests/ragger/usage.md)
Or run your app directly with Speculos
```shell
speculos --model nanos build/nanos/bin/app.elf
```
#### macOS / Windows
To test your app on macOS or Windows, it is recommended to use [Ledger's VS Code extension](#with-vscode)
to quickly setup a working test environment.
You can use the following sequence of tasks and commands (all accessible in the **extension sidebar menu**):
- `Select build target`
- `Build app`
Then you can choose to execute the functional tests:
- Use `Run tests`.
Or simply run the app on the Speculos emulator:
- `Run with Speculos`.
### Unit Tests
Those tests are available in the directory `tests/unit`. Please see the corresponding [README](tests/unit/README.md)
to compile and run them.
## Contributing
Contributions are what make the open source community such an amazing place to learn, inspire, and create.
Any contributions you make are **greatly appreciated**.
If you have a suggestion that would make this better, please fork the repo and create a pull request.
You can also simply open an issue with the tag `enhancement`.
1. Fork the Project
2. Create your Feature Branch (`git checkout -b feature/my-feature`)
@@ -5,6 +5,29 @@ All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
## [0.4.1] - 2024-04-15
### Added
- Add a new function `send_raw`, which sends a raw APDU payload
- Add new error code definition
### Fixed
- Encoding of EIP-712 bytes elements
## [0.4.0] - 2024-04-03
### Added
- Changed the client's `sync` functions so they no longer use `async` syntax
## [0.3.1] - 2024-03-12
### Fixed
- `recover_transaction` & `recover_message` util functions
## [0.3.0] - 2024-02-13
### Added
@@ -48,4 +48,4 @@ Home = "https://github.com/LedgerHQ/app-ethereum"
ignore_missing_imports = true
[tool.flake8]
max-line-length = 120
@@ -1,4 +1,4 @@
try:
from ledger_app_clients.ethereum.__version__ import __version__ # noqa
from __version__ import __version__ # noqa
except ImportError:
__version__ = "unknown version" # noqa
@@ -21,6 +21,7 @@ class StatusWord(IntEnum):
INVALID_P1_P2 = 0x6b00
CONDITION_NOT_SATISFIED = 0x6985
REF_DATA_NOT_FOUND = 0x6a88
EXCEPTION_OVERFLOW = 0x6807
class DomainNameTag(IntEnum):
@@ -40,14 +41,26 @@ class EthAppClient:
self._client = client
self._cmd_builder = CommandBuilder()
def _send(self, payload: bytes):
def _exchange_async(self, payload: bytes):
return self._client.exchange_async_raw(payload)
def _exchange(self, payload: bytes):
return self._client.exchange_raw(payload)
def response(self) -> Optional[RAPDU]:
return self._client.last_async_response
def send_raw(self, cla: int, ins: int, p1: int, p2: int, payload: bytes):
header = bytearray()
header.append(cla)
header.append(ins)
header.append(p1)
header.append(p2)
header.append(len(payload))
return self._exchange(header + payload)
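The `send_raw` helper above frames the payload as a short ISO 7816 APDU: four header bytes (CLA, INS, P1, P2) followed by a one-byte length and the data. A standalone sketch of that framing (function name hypothetical, not part of the client):

```python
def build_apdu(cla: int, ins: int, p1: int, p2: int, payload: bytes) -> bytes:
    """Frame a short APDU: CLA INS P1 P2 Lc <payload>."""
    if len(payload) > 255:
        raise ValueError("short APDU payload is limited to 255 bytes")
    return bytes([cla, ins, p1, p2, len(payload)]) + payload

# 0xE0 is the CLA conventionally used by Ledger apps
apdu = build_apdu(0xE0, 0x02, 0x00, 0x00, b"\x01\x02")
```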
def eip712_send_struct_def_struct_name(self, name: str):
return self._send(self._cmd_builder.eip712_send_struct_def_struct_name(name))
return self._exchange_async(self._cmd_builder.eip712_send_struct_def_struct_name(name))
def eip712_send_struct_def_struct_field(self,
field_type: EIP712FieldType,
@@ -55,7 +68,7 @@ class EthAppClient:
type_size: int,
array_levels: list,
key_name: str):
return self._send(self._cmd_builder.eip712_send_struct_def_struct_field(
return self._exchange_async(self._cmd_builder.eip712_send_struct_def_struct_field(
field_type,
type_name,
type_size,
@@ -63,37 +76,50 @@ class EthAppClient:
key_name))
def eip712_send_struct_impl_root_struct(self, name: str):
return self._send(self._cmd_builder.eip712_send_struct_impl_root_struct(name))
return self._exchange_async(self._cmd_builder.eip712_send_struct_impl_root_struct(name))
def eip712_send_struct_impl_array(self, size: int):
return self._send(self._cmd_builder.eip712_send_struct_impl_array(size))
return self._exchange_async(self._cmd_builder.eip712_send_struct_impl_array(size))
def eip712_send_struct_impl_struct_field(self, raw_value: bytes):
chunks = self._cmd_builder.eip712_send_struct_impl_struct_field(bytearray(raw_value))
for chunk in chunks[:-1]:
with self._send(chunk):
pass
return self._send(chunks[-1])
self._exchange(chunk)
return self._exchange_async(chunks[-1])
def eip712_sign_new(self, bip32_path: str):
return self._send(self._cmd_builder.eip712_sign_new(bip32_path))
return self._exchange_async(self._cmd_builder.eip712_sign_new(bip32_path))
def eip712_sign_legacy(self,
bip32_path: str,
domain_hash: bytes,
message_hash: bytes):
return self._send(self._cmd_builder.eip712_sign_legacy(bip32_path,
domain_hash,
message_hash))
return self._exchange_async(self._cmd_builder.eip712_sign_legacy(bip32_path,
domain_hash,
message_hash))
def eip712_filtering_activate(self):
return self._send(self._cmd_builder.eip712_filtering_activate())
return self._exchange_async(self._cmd_builder.eip712_filtering_activate())
def eip712_filtering_message_info(self, name: str, filters_count: int, sig: bytes):
return self._send(self._cmd_builder.eip712_filtering_message_info(name, filters_count, sig))
return self._exchange_async(self._cmd_builder.eip712_filtering_message_info(name,
filters_count,
sig))
def eip712_filtering_show_field(self, name: str, sig: bytes):
return self._send(self._cmd_builder.eip712_filtering_show_field(name, sig))
def eip712_filtering_amount_join_token(self, token_idx: int, sig: bytes):
return self._exchange_async(self._cmd_builder.eip712_filtering_amount_join_token(token_idx,
sig))
def eip712_filtering_amount_join_value(self, token_idx: int, name: str, sig: bytes):
return self._exchange_async(self._cmd_builder.eip712_filtering_amount_join_value(token_idx,
name,
sig))
def eip712_filtering_datetime(self, name: str, sig: bytes):
return self._exchange_async(self._cmd_builder.eip712_filtering_datetime(name, sig))
def eip712_filtering_raw(self, name: str, sig: bytes):
return self._exchange_async(self._cmd_builder.eip712_filtering_raw(name, sig))
def sign(self,
bip32_path: str,
@@ -111,24 +137,37 @@ class EthAppClient:
tx = prefix + rlp.encode(decoded + suffix)
chunks = self._cmd_builder.sign(bip32_path, tx, suffix)
for chunk in chunks[:-1]:
with self._send(chunk):
pass
return self._send(chunks[-1])
self._exchange(chunk)
return self._exchange_async(chunks[-1])
def get_challenge(self):
return self._send(self._cmd_builder.get_challenge())
return self._exchange(self._cmd_builder.get_challenge())
def get_public_addr(self,
display: bool = True,
chaincode: bool = False,
bip32_path: str = "m/44'/60'/0'/0/0",
chain_id: Optional[int] = None):
return self._send(self._cmd_builder.get_public_addr(display,
chaincode,
bip32_path,
chain_id))
chain_id: Optional[int] = None) -> RAPDU:
return self._exchange_async(self._cmd_builder.get_public_addr(display,
chaincode,
bip32_path,
chain_id))
def provide_domain_name(self, challenge: int, name: str, addr: bytes):
def get_eth2_public_addr(self,
display: bool = True,
bip32_path: str = "m/12381/3600/0/0"):
return self._exchange_async(self._cmd_builder.get_eth2_public_addr(display,
bip32_path))
def perform_privacy_operation(self,
display: bool = True,
bip32_path: str = "m/44'/60'/0'/0/0",
pubkey: bytes = bytes()):
return self._exchange(self._cmd_builder.perform_privacy_operation(display,
bip32_path,
pubkey))
def provide_domain_name(self, challenge: int, name: str, addr: bytes) -> RAPDU:
payload = format_tlv(DomainNameTag.STRUCTURE_TYPE, 3) # TrustedDomainName
payload += format_tlv(DomainNameTag.STRUCTURE_VERSION, 1)
payload += format_tlv(DomainNameTag.SIGNER_KEY_ID, 0) # test key
@@ -142,9 +181,8 @@ class EthAppClient:
chunks = self._cmd_builder.provide_domain_name(payload)
for chunk in chunks[:-1]:
with self._send(chunk):
pass
return self._send(chunks[-1])
self._exchange(chunk)
return self._exchange(chunks[-1])
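`provide_domain_name` assembles its payload from TLV (tag-length-value) records via `format_tlv`, then sends it in chunks. Assuming the simple one-byte tag and one-byte length encoding implied here (an assumption from context, not the helper's documented contract), a minimal sketch:

```python
def format_tlv_sketch(tag: int, value) -> bytes:
    """Encode one TLV record: 1-byte tag, 1-byte length, raw value."""
    if isinstance(value, int):
        # minimal big-endian representation of the integer
        value = value.to_bytes(max(1, (value.bit_length() + 7) // 8), "big")
    elif isinstance(value, str):
        value = value.encode()
    return bytes([tag, len(value)]) + value

# e.g. a STRUCTURE_VERSION-style record with value 1 (tag number hypothetical)
record = format_tlv_sketch(0x02, 1)
```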
def set_plugin(self,
plugin_name: str,
@@ -155,7 +193,7 @@ class EthAppClient:
version: int = 1,
key_id: int = 2,
algo_id: int = 1,
sig: Optional[bytes] = None):
sig: Optional[bytes] = None) -> RAPDU:
if sig is None:
# Temporarily get a command with an empty signature to extract the payload and
# compute the signature on it
@@ -170,15 +208,15 @@ class EthAppClient:
bytes())
# skip APDU header & empty sig
sig = sign_data(Key.SET_PLUGIN, tmp[5:-1])
return self._send(self._cmd_builder.set_plugin(type_,
version,
plugin_name,
contract_addr,
selector,
chain_id,
key_id,
algo_id,
sig))
return self._exchange(self._cmd_builder.set_plugin(type_,
version,
plugin_name,
contract_addr,
selector,
chain_id,
key_id,
algo_id,
sig))
def provide_nft_metadata(self,
collection: str,
@@ -188,7 +226,7 @@ class EthAppClient:
version: int = 1,
key_id: int = 1,
algo_id: int = 1,
sig: Optional[bytes] = None):
sig: Optional[bytes] = None) -> RAPDU:
if sig is None:
# Temporarily get a command with an empty signature to extract the payload and
# compute the signature on it
@@ -202,20 +240,20 @@ class EthAppClient:
bytes())
# skip APDU header & empty sig
sig = sign_data(Key.NFT, tmp[5:-1])
return self._send(self._cmd_builder.provide_nft_information(type_,
version,
collection,
addr,
chain_id,
key_id,
algo_id,
sig))
return self._exchange(self._cmd_builder.provide_nft_information(type_,
version,
collection,
addr,
chain_id,
key_id,
algo_id,
sig))
def set_external_plugin(self,
plugin_name: str,
contract_address: bytes,
method_selelector: bytes,
sig: Optional[bytes] = None):
sig: Optional[bytes] = None) -> RAPDU:
if sig is None:
# Temporarily get a command with an empty signature to extract the payload and
# compute the signature on it
@@ -223,21 +261,23 @@ class EthAppClient:
# skip APDU header & empty sig
sig = sign_data(Key.CAL, tmp[5:])
return self._send(self._cmd_builder.set_external_plugin(plugin_name, contract_address, method_selelector, sig))
return self._exchange(self._cmd_builder.set_external_plugin(plugin_name,
contract_address,
method_selelector,
sig))
def personal_sign(self, path: str, msg: bytes):
chunks = self._cmd_builder.personal_sign(path, msg)
for chunk in chunks[:-1]:
with self._send(chunk):
pass
return self._send(chunks[-1])
self._exchange(chunk)
return self._exchange_async(chunks[-1])
def provide_token_metadata(self,
ticker: str,
addr: bytes,
decimals: int,
chain_id: int,
sig: Optional[bytes] = None):
sig: Optional[bytes] = None) -> RAPDU:
if sig is None:
# Temporarily get a command with an empty signature to extract the payload and
# compute the signature on it
@@ -248,8 +288,8 @@ class EthAppClient:
bytes())
# skip APDU header & empty sig
sig = sign_data(Key.CAL, tmp[6:])
return self._send(self._cmd_builder.provide_erc20_token_information(ticker,
addr,
decimals,
chain_id,
sig))
return self._exchange(self._cmd_builder.provide_erc20_token_information(ticker,
addr,
decimals,
chain_id,
sig))
@@ -11,11 +11,13 @@ from .eip712 import EIP712FieldType
class InsType(IntEnum):
GET_PUBLIC_ADDR = 0x02
GET_ETH2_PUBLIC_ADDR = 0x0e
SIGN = 0x04
PERSONAL_SIGN = 0x08
PROVIDE_ERC20_TOKEN_INFORMATION = 0x0a
PROVIDE_NFT_INFORMATION = 0x14
SET_PLUGIN = 0x16
PERFORM_PRIVACY_OPERATION = 0x18
EIP712_SEND_STRUCT_DEF = 0x1a
EIP712_SEND_STRUCT_IMPL = 0x1c
EIP712_SEND_FILTERING = 0x1e
@@ -39,8 +41,11 @@ class P2Type(IntEnum):
LEGACY_IMPLEM = 0x00
NEW_IMPLEM = 0x01
FILTERING_ACTIVATE = 0x00
FILTERING_CONTRACT_NAME = 0x0f
FILTERING_FIELD_NAME = 0xff
FILTERING_MESSAGE_INFO = 0x0f
FILTERING_DATETIME = 0xfc
FILTERING_TOKEN_ADDR_CHECK = 0xfd
FILTERING_AMOUNT_FIELD = 0xfe
FILTERING_RAW = 0xff
class CommandBuilder:
@@ -60,17 +65,11 @@ class CommandBuilder:
header.append(len(cdata))
return header + cdata
def _string_to_bytes(self, string: str) -> bytes:
data = bytearray()
for char in string:
data.append(ord(char))
return data
def eip712_send_struct_def_struct_name(self, name: str) -> bytes:
return self._serialize(InsType.EIP712_SEND_STRUCT_DEF,
P1Type.COMPLETE_SEND,
P2Type.STRUCT_NAME,
self._string_to_bytes(name))
name.encode())
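The replaced `_string_to_bytes` helper built the byte string one `ord()` at a time; `str.encode()` produces identical bytes for ASCII input, which is all these APDU fields contain (for non-ASCII text the two would differ, since `ord()` can exceed 255):

```python
def string_to_bytes_loop(string: str) -> bytes:
    # Old approach: append each character's code point one by one
    data = bytearray()
    for char in string:
        data.append(ord(char))
    return bytes(data)

# Equivalent for ASCII strings such as EIP-712 type names
assert string_to_bytes_loop("EIP712Domain") == "EIP712Domain".encode()
```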
def eip712_send_struct_def_struct_field(self,
field_type: EIP712FieldType,
@@ -86,7 +85,7 @@ class CommandBuilder:
data.append(typedesc)
if field_type == EIP712FieldType.CUSTOM:
data.append(len(type_name))
data += self._string_to_bytes(type_name)
data += type_name.encode()
if type_size is not None:
data.append(type_size)
if len(array_levels) > 0:
@@ -96,7 +95,7 @@ class CommandBuilder:
if level is not None:
data.append(level)
data.append(len(key_name))
data += self._string_to_bytes(key_name)
data += key_name.encode()
return self._serialize(InsType.EIP712_SEND_STRUCT_DEF,
P1Type.COMPLETE_SEND,
P2Type.STRUCT_FIELD,
@@ -106,7 +105,7 @@ class CommandBuilder:
return self._serialize(InsType.EIP712_SEND_STRUCT_IMPL,
P1Type.COMPLETE_SEND,
P2Type.STRUCT_NAME,
self._string_to_bytes(name))
name.encode())
def eip712_send_struct_impl_array(self, size: int) -> bytes:
data = bytearray()
@@ -160,7 +159,7 @@ class CommandBuilder:
def _eip712_filtering_send_name(self, name: str, sig: bytes) -> bytes:
data = bytearray()
data.append(len(name))
data += self._string_to_bytes(name)
data += name.encode()
data.append(len(sig))
data += sig
return data
@@ -168,25 +167,53 @@ class CommandBuilder:
def eip712_filtering_message_info(self, name: str, filters_count: int, sig: bytes) -> bytes:
data = bytearray()
data.append(len(name))
data += self._string_to_bytes(name)
data += name.encode()
data.append(filters_count)
data.append(len(sig))
data += sig
return self._serialize(InsType.EIP712_SEND_FILTERING,
P1Type.COMPLETE_SEND,
P2Type.FILTERING_CONTRACT_NAME,
P2Type.FILTERING_MESSAGE_INFO,
data)
def eip712_filtering_show_field(self, name: str, sig: bytes) -> bytes:
def eip712_filtering_amount_join_token(self, token_idx: int, sig: bytes) -> bytes:
data = bytearray()
data.append(token_idx)
data.append(len(sig))
data += sig
return self._serialize(InsType.EIP712_SEND_FILTERING,
P1Type.COMPLETE_SEND,
P2Type.FILTERING_FIELD_NAME,
P2Type.FILTERING_TOKEN_ADDR_CHECK,
data)
def eip712_filtering_amount_join_value(self, token_idx: int, name: str, sig: bytes) -> bytes:
data = bytearray()
data.append(len(name))
data += name.encode()
data.append(token_idx)
data.append(len(sig))
data += sig
return self._serialize(InsType.EIP712_SEND_FILTERING,
P1Type.COMPLETE_SEND,
P2Type.FILTERING_AMOUNT_FIELD,
data)
def eip712_filtering_datetime(self, name: str, sig: bytes) -> bytes:
return self._serialize(InsType.EIP712_SEND_FILTERING,
P1Type.COMPLETE_SEND,
P2Type.FILTERING_DATETIME,
self._eip712_filtering_send_name(name, sig))
def eip712_filtering_raw(self, name: str, sig: bytes) -> bytes:
return self._serialize(InsType.EIP712_SEND_FILTERING,
P1Type.COMPLETE_SEND,
P2Type.FILTERING_RAW,
self._eip712_filtering_send_name(name, sig))
def set_external_plugin(self, plugin_name: str, contract_address: bytes, selector: bytes, sig: bytes) -> bytes:
data = bytearray()
data.append(len(plugin_name))
data += self._string_to_bytes(plugin_name)
data += plugin_name.encode()
data += contract_address
data += selector
data += sig
@@ -250,6 +277,25 @@ class CommandBuilder:
int(chaincode),
payload)
def get_eth2_public_addr(self,
display: bool,
bip32_path: str) -> bytes:
payload = pack_derivation_path(bip32_path)
return self._serialize(InsType.GET_ETH2_PUBLIC_ADDR,
int(display),
0x00,
payload)
def perform_privacy_operation(self,
display: bool,
bip32_path: str,
pubkey: bytes) -> bytes:
payload = pack_derivation_path(bip32_path)
return self._serialize(InsType.PERFORM_PRIVACY_OPERATION,
int(display),
0x01 if pubkey else 0x00,
payload + pubkey)
def set_plugin(self,
type_: int,
version: int,

View File

@@ -4,11 +4,13 @@ import re
import signal
import sys
import copy
from typing import Any, Callable, Optional
from typing import Any, Callable, Optional, Union
import struct
from ledger_app_clients.ethereum import keychain
from ledger_app_clients.ethereum.client import EthAppClient, EIP712FieldType
from client import keychain
from client.client import EthAppClient, EIP712FieldType
from ragger.firmware import Firmware
# global variables
app_client: EthAppClient = None
@@ -22,6 +24,7 @@ def default_handler():
autonext_handler: Callable = default_handler
is_golden_run: bool
# From a string typename, extract the type and all the array depth
@@ -118,69 +121,60 @@ def send_struct_def_field(typename, keyname):
return (typename, type_enum, typesize, array_lvls)
def encode_integer(value, typesize):
data = bytearray()
def encode_integer(value: Union[str, int], typesize: int) -> bytes:
# Some are already represented as integers in the JSON, but most as strings
if isinstance(value, str):
base = 10
if value.startswith("0x"):
base = 16
value = int(value, base)
value = int(value, 0)
if value == 0:
data.append(0)
data = b'\x00'
else:
if value < 0: # negative number, send it as unsigned
mask = 0
for i in range(typesize): # make a mask as big as the typesize
mask = (mask << 8) | 0xff
value &= mask
while value > 0:
data.append(value & 0xff)
value >>= 8
data.reverse()
# biggest uint type accepted by struct.pack
uint64_mask = 0xffffffffffffffff
data = struct.pack(">QQQQ",
(value >> 192) & uint64_mask,
(value >> 128) & uint64_mask,
(value >> 64) & uint64_mask,
value & uint64_mask)
data = data[len(data) - typesize:]
data = data.lstrip(b'\x00')
return data
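Pulled out of the diff, the new `struct.pack`-based path can be exercised standalone. This is a minimal sketch mirroring the hunk above (same function name, surrounding module context dropped):

```python
import struct

def encode_integer(value, typesize: int) -> bytes:
    # values come from the JSON either as ints or as decimal/0x-prefixed strings
    if isinstance(value, str):
        value = int(value, 0)
    if value == 0:
        return b"\x00"
    # biggest uint type accepted by struct.pack is 64-bit, so pack four of them;
    # masking each 64-bit chunk also maps negatives to their two's complement
    uint64_mask = 0xffffffffffffffff
    data = struct.pack(">QQQQ",
                       (value >> 192) & uint64_mask,
                       (value >> 128) & uint64_mask,
                       (value >> 64) & uint64_mask,
                       value & uint64_mask)
    # keep the low `typesize` bytes, then drop leading zero padding
    return data[len(data) - typesize:].lstrip(b"\x00")
```

For example, `encode_integer("0x1234", 32)` yields `b"\x12\x34"`, and `encode_integer(-1, 4)` yields the four-byte two's complement `b"\xff\xff\xff\xff"`.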
def encode_int(value, typesize):
def encode_int(value: str, typesize: int) -> bytes:
return encode_integer(value, typesize)
def encode_uint(value, typesize):
def encode_uint(value: str, typesize: int) -> bytes:
return encode_integer(value, typesize)
def encode_hex_string(value, size):
data = bytearray()
value = value[2:] # skip 0x
byte_idx = 0
while byte_idx < size:
data.append(int(value[(byte_idx * 2):(byte_idx * 2 + 2)], 16))
byte_idx += 1
return data
def encode_hex_string(value: str, size: int) -> bytes:
assert value.startswith("0x")
value = value[2:]
if len(value) < (size * 2):
value = value.rjust(size * 2, "0")
assert len(value) == (size * 2)
return bytes.fromhex(value)
def encode_address(value, typesize):
def encode_address(value: str, typesize: int) -> bytes:
return encode_hex_string(value, 20)
def encode_bool(value, typesize):
return encode_integer(value, typesize)
def encode_bool(value: str, typesize: int) -> bytes:
return encode_integer(value, 1)
def encode_string(value, typesize):
data = bytearray()
for char in value:
data.append(ord(char))
return data
def encode_string(value: str, typesize: int) -> bytes:
return value.encode()
def encode_bytes_fix(value, typesize):
def encode_bytes_fix(value: str, typesize: int) -> bytes:
return encode_hex_string(value, typesize)
def encode_bytes_dyn(value, typesize):
def encode_bytes_dyn(value: str, typesize: int) -> bytes:
    # byte length = (length of the value string
    #                - the length of the "0x" prefix (2))
    #               / the length of one byte in a hex string (2)
@@ -208,7 +202,22 @@ def send_struct_impl_field(value, field):
if filtering_paths:
path = ".".join(current_path)
if path in filtering_paths.keys():
send_filtering_show_field(filtering_paths[path])
if filtering_paths[path]["type"] == "amount_join_token":
send_filtering_amount_join_token(filtering_paths[path]["token"])
elif filtering_paths[path]["type"] == "amount_join_value":
if "token" in filtering_paths[path].keys():
token = filtering_paths[path]["token"]
else:
# Permit (ERC-2612)
token = 0xff
send_filtering_amount_join_value(token,
filtering_paths[path]["name"])
elif filtering_paths[path]["type"] == "datetime":
send_filtering_datetime(filtering_paths[path]["name"])
elif filtering_paths[path]["type"] == "raw":
send_filtering_raw(filtering_paths[path]["name"])
else:
assert False
with app_client.eip712_send_struct_impl_struct_field(data):
enable_autonext()
@@ -259,18 +268,24 @@ def send_struct_impl(structs, data, structname):
return True
def start_signature_payload(ctx: dict, magic: int) -> bytearray:
to_sign = bytearray()
    # magic number so that a signature for one type of filter can't possibly be
    # valid for another; defined in the APDU specs
to_sign.append(magic)
to_sign += ctx["chainid"]
to_sign += ctx["caddr"]
to_sign += ctx["schema_hash"]
return to_sign
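For illustration, here is how a complete payload for one filter type is assembled on top of that common prefix. The field widths chosen for `ctx` below are placeholders picked for this sketch, not values mandated by the test suite:

```python
def raw_filter_payload(ctx: dict, path: str, display_name: str) -> bytes:
    # common prefix: magic || chain ID || contract address || schema hash
    to_sign = bytearray()
    to_sign.append(72)  # magic for "raw" filters (see send_filtering_raw)
    to_sign += ctx["chainid"]
    to_sign += ctx["caddr"]
    to_sign += ctx["schema_hash"]
    # raw filters then append the field path and the display name
    to_sign += path.encode()
    to_sign += display_name.encode()
    return bytes(to_sign)

# hypothetical context: the byte widths here are illustrative only
ctx = {
    "chainid": (1).to_bytes(8, "big"),
    "caddr": bytes(20),
    "schema_hash": bytes(28),
}
payload = raw_filter_payload(ctx, "details.value", "Value")
```

The resulting `payload` is what gets signed with the CAL key before being sent alongside the filter APDU.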
# ledgerjs doesn't actually sign anything, and instead uses already pre-computed signatures
def send_filtering_message_info(display_name: str, filters_count: int):
global sig_ctx
to_sign = bytearray()
to_sign.append(183)
to_sign += sig_ctx["chainid"]
to_sign += sig_ctx["caddr"]
to_sign += sig_ctx["schema_hash"]
to_sign = start_signature_payload(sig_ctx, 183)
to_sign.append(filters_count)
for char in display_name:
to_sign.append(ord(char))
to_sign += display_name.encode()
sig = keychain.sign_data(keychain.Key.CAL, to_sign)
with app_client.eip712_filtering_message_info(display_name, filters_count, sig):
@@ -278,23 +293,57 @@ def send_filtering_message_info(display_name: str, filters_count: int):
disable_autonext()
# ledgerjs doesn't actually sign anything, and instead uses already pre-computed signatures
def send_filtering_show_field(display_name):
def send_filtering_amount_join_token(token_idx: int):
global sig_ctx
path_str = ".".join(current_path)
to_sign = bytearray()
to_sign.append(72)
to_sign += sig_ctx["chainid"]
to_sign += sig_ctx["caddr"]
to_sign += sig_ctx["schema_hash"]
for char in path_str:
to_sign.append(ord(char))
for char in display_name:
to_sign.append(ord(char))
to_sign = start_signature_payload(sig_ctx, 11)
to_sign += path_str.encode()
to_sign.append(token_idx)
sig = keychain.sign_data(keychain.Key.CAL, to_sign)
with app_client.eip712_filtering_show_field(display_name, sig):
with app_client.eip712_filtering_amount_join_token(token_idx, sig):
pass
def send_filtering_amount_join_value(token_idx: int, display_name: str):
global sig_ctx
path_str = ".".join(current_path)
to_sign = start_signature_payload(sig_ctx, 22)
to_sign += path_str.encode()
to_sign += display_name.encode()
to_sign.append(token_idx)
sig = keychain.sign_data(keychain.Key.CAL, to_sign)
with app_client.eip712_filtering_amount_join_value(token_idx, display_name, sig):
pass
def send_filtering_datetime(display_name: str):
global sig_ctx
path_str = ".".join(current_path)
to_sign = start_signature_payload(sig_ctx, 33)
to_sign += path_str.encode()
to_sign += display_name.encode()
sig = keychain.sign_data(keychain.Key.CAL, to_sign)
with app_client.eip712_filtering_datetime(display_name, sig):
pass
# ledgerjs doesn't actually sign anything, and instead uses already pre-computed signatures
def send_filtering_raw(display_name):
global sig_ctx
path_str = ".".join(current_path)
to_sign = start_signature_payload(sig_ctx, 72)
to_sign += path_str.encode()
to_sign += display_name.encode()
sig = keychain.sign_data(keychain.Key.CAL, to_sign)
with app_client.eip712_filtering_raw(display_name, sig):
pass
@@ -305,6 +354,12 @@ def prepare_filtering(filtr_data, message):
filtering_paths = filtr_data["fields"]
else:
filtering_paths = {}
if "tokens" in filtr_data:
for token in filtr_data["tokens"]:
app_client.provide_token_metadata(token["ticker"],
bytes.fromhex(token["addr"][2:]),
token["decimals"],
token["chain_id"])
def handle_optional_domain_values(domain):
@@ -337,10 +392,16 @@ def next_timeout(_signum: int, _frame):
def enable_autonext():
if app_client._client.firmware.device == 'stax': # Stax Speculos is slow
delay = 1.5
if app_client._client.firmware in (Firmware.STAX, Firmware.FLEX):
delay = 1/3
else:
delay = 1/4
    # golden run has to be slower to make sure we take good snapshots
    # and don't capture processing/loading screens
if is_golden_run:
delay *= 3
signal.setitimer(signal.ITIMER_REAL, delay, delay)
@@ -351,10 +412,12 @@ def disable_autonext():
def process_data(aclient: EthAppClient,
data_json: dict,
filters: Optional[dict] = None,
autonext: Optional[Callable] = None) -> bool:
autonext: Optional[Callable] = None,
golden_run: bool = False) -> bool:
global sig_ctx
global app_client
global autonext_handler
global is_golden_run
# deepcopy because this function modifies the dict
data_json = copy.deepcopy(data_json)
@@ -369,6 +432,8 @@ def process_data(aclient: EthAppClient,
autonext_handler = autonext
signal.signal(signal.SIGALRM, next_timeout)
is_golden_run = golden_run
if filters:
init_signature_context(types, domain)

View File

@@ -5,49 +5,55 @@ from typing import Union
class SettingID(Enum):
BLIND_SIGNING = auto()
DEBUG_DATA = auto()
NONCE = auto()
VERBOSE_EIP712 = auto()
VERBOSE_ENS = auto()
VERBOSE_EIP712 = auto()
NONCE = auto()
DEBUG_DATA = auto()
def get_device_settings(device: str) -> list[SettingID]:
if device == "nanos":
def get_device_settings(firmware: Firmware) -> list[SettingID]:
if firmware == Firmware.NANOS:
return [
SettingID.BLIND_SIGNING,
SettingID.DEBUG_DATA,
SettingID.NONCE
]
if (device == "nanox") or (device == "nanosp") or (device == "stax"):
return [
SettingID.BLIND_SIGNING,
SettingID.DEBUG_DATA,
SettingID.NONCE,
SettingID.VERBOSE_EIP712,
SettingID.VERBOSE_ENS
SettingID.DEBUG_DATA,
]
return []
return [
SettingID.VERBOSE_ENS,
SettingID.VERBOSE_EIP712,
SettingID.NONCE,
SettingID.DEBUG_DATA,
]
settings_per_page = 3
def get_setting_per_page(firmware: Firmware) -> int:
if firmware == Firmware.STAX:
return 3
return 2
def get_setting_position(device: str, setting: Union[NavInsID, SettingID]) -> tuple[int, int]:
screen_height = 672 # px
header_height = 85 # px
footer_height = 124 # px
def get_setting_position(firmware: Firmware, setting: Union[NavInsID, SettingID]) -> tuple[int, int]:
settings_per_page = get_setting_per_page(firmware)
if firmware == Firmware.STAX:
screen_height = 672 # px
header_height = 88 # px
footer_height = 92 # px
option_offset = 350 # px
else:
screen_height = 600 # px
header_height = 92 # px
footer_height = 97 # px
option_offset = 420 # px
usable_height = screen_height - (header_height + footer_height)
setting_height = usable_height // settings_per_page
index_in_page = get_device_settings(device).index(SettingID(setting)) % settings_per_page
return 350, header_height + (setting_height * index_in_page) + (setting_height // 2)
index_in_page = get_device_settings(firmware).index(SettingID(setting)) % settings_per_page
return option_offset, header_height + (setting_height * index_in_page) + (setting_height // 2)
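The geometry above can be checked in isolation. The sketch below inlines the same pixel constants, with the firmware reduced to a plain string and the setting identified by its index (simplifications made only to keep the example self-contained):

```python
def get_setting_position(firmware: str, setting_index: int) -> tuple:
    # geometry constants from the hunk above, in pixels
    if firmware == "stax":
        screen_height, header_height, footer_height = 672, 88, 92
        option_offset, settings_per_page = 350, 3
    else:  # flex
        screen_height, header_height, footer_height = 600, 92, 97
        option_offset, settings_per_page = 420, 2
    usable_height = screen_height - (header_height + footer_height)
    setting_height = usable_height // settings_per_page
    index_in_page = setting_index % settings_per_page
    # x is fixed per device; y is the vertical middle of the setting's row
    y = header_height + (setting_height * index_in_page) + (setting_height // 2)
    return option_offset, y
```

With these constants, the first setting of a page on Stax lands at (350, 170) and the second setting of a page on Flex at (420, 399).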
def settings_toggle(fw: Firmware, nav: Navigator, to_toggle: list[SettingID]):
def settings_toggle(firmware: Firmware, nav: Navigator, to_toggle: list[SettingID]):
moves: list[Union[NavIns, NavInsID]] = list()
settings = get_device_settings(fw.device)
settings = get_device_settings(firmware)
# Assume the app is on the home page
if fw.device.startswith("nano"):
if firmware.is_nano:
moves += [NavInsID.RIGHT_CLICK] * 2
moves += [NavInsID.BOTH_CLICK]
for setting in settings:
@@ -57,12 +63,12 @@ def settings_toggle(fw: Firmware, nav: Navigator, to_toggle: list[SettingID]):
moves += [NavInsID.BOTH_CLICK] # Back
else:
moves += [NavInsID.USE_CASE_HOME_SETTINGS]
moves += [NavInsID.USE_CASE_SETTINGS_NEXT]
settings_per_page = get_setting_per_page(firmware)
for setting in settings:
setting_idx = settings.index(setting)
if (setting_idx > 0) and (setting_idx % settings_per_page) == 0:
moves += [NavInsID.USE_CASE_SETTINGS_NEXT]
if setting in to_toggle:
moves += [NavIns(NavInsID.TOUCH, get_setting_position(fw.device, setting))]
moves += [NavIns(NavInsID.TOUCH, get_setting_position(firmware, setting))]
moves += [NavInsID.USE_CASE_SETTINGS_MULTI_PAGE_EXIT]
nav.navigate(moves, screen_change_before_first_instruction=False)

View File

@@ -3,6 +3,14 @@ from eth_account.messages import encode_defunct, encode_typed_data
import rlp
# eth_account requires it for some reason
def normalize_vrs(vrs: tuple) -> tuple:
vrs_l = list()
for elem in vrs:
vrs_l.append(elem.lstrip(b'\x00'))
return tuple(vrs_l)
def get_selector_from_data(data: str) -> bytes:
raw_data = bytes.fromhex(data[2:])
return raw_data[:4]
@@ -13,7 +21,7 @@ def recover_message(msg, vrs: tuple) -> bytes:
smsg = encode_typed_data(full_message=msg)
else: # EIP-191
smsg = encode_defunct(primitive=msg)
addr = Account.recover_message(smsg, vrs)
addr = Account.recover_message(smsg, normalize_vrs(vrs))
return bytes.fromhex(addr[2:])
@@ -23,17 +31,33 @@ def recover_transaction(tx_params, vrs: tuple) -> bytes:
if raw_tx[0] in [0x01, 0x02]:
prefix = raw_tx[:1]
raw_tx = raw_tx[len(prefix):]
# v is returned on one byte only so it might have overflowed
# in that case, we will reconstruct it to its full value
if "chainId" in tx_params:
trunc_chain_id = tx_params["chainId"]
while trunc_chain_id.bit_length() > 32:
trunc_chain_id >>= 8
target = tx_params["chainId"] * 2 + 35
trunc_target = trunc_chain_id * 2 + 35
diff = vrs[0][0] - (trunc_target & 0xff)
vrs = (target + diff, vrs[1], vrs[2])
else:
if "chainId" in tx_params:
# v is returned on one byte only so it might have overflowed
# in that case, we will reconstruct it to its full value
trunc_chain_id = tx_params["chainId"]
while trunc_chain_id.bit_length() > 32:
trunc_chain_id >>= 8
trunc_target = trunc_chain_id * 2 + 35
trunc_v = int.from_bytes(vrs[0], "big")
if (trunc_target & 0xff) == trunc_v:
parity = 0
elif ((trunc_target + 1) & 0xff) == trunc_v:
parity = 1
else:
# should have matched with a previous if
assert False
# https://github.com/ethereum/EIPs/blob/master/EIPS/eip-155.md
full_v = parity + tx_params["chainId"] * 2 + 35
# 9 bytes would be big enough even for the biggest chain ID
vrs = (int(full_v).to_bytes(9, "big"), vrs[1], vrs[2])
else:
# Pre EIP-155 TX
assert False
decoded = rlp.decode(raw_tx)
reencoded = rlp.encode(decoded[:-3] + list(vrs))
reencoded = rlp.encode(decoded[:-3] + list(normalize_vrs(vrs)))
addr = Account.recover_transaction(prefix + reencoded)
return bytes.fromhex(addr[2:])
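The v-reconstruction above can be condensed into a small helper. This is a sketch of the same logic operating on plain ints (the test code keeps v as bytes):

```python
def reconstruct_v(trunc_v: int, chain_id: int) -> int:
    # the device returns v on a single byte, so for large chain IDs it has
    # overflowed; truncate the chain ID the same way the app does to recover
    # the parity bit, then rebuild the full EIP-155 v
    trunc_chain_id = chain_id
    while trunc_chain_id.bit_length() > 32:
        trunc_chain_id >>= 8
    trunc_target = trunc_chain_id * 2 + 35
    if (trunc_target & 0xff) == trunc_v:
        parity = 0
    elif ((trunc_target + 1) & 0xff) == trunc_v:
        parity = 1
    else:
        raise ValueError("truncated v does not match the chain ID")
    # https://github.com/ethereum/EIPs/blob/master/EIPS/eip-155.md
    return parity + chain_id * 2 + 35
```

On mainnet (chain ID 1) v never overflows, so the helper is the identity on 37/38; for a chain ID wider than one byte it recovers the full value from the truncated byte.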

View File

@@ -12,7 +12,7 @@ This document described how a specific device UI for a smart contract can be add
## Standard support
The applications already includes dedicated UI support for those specific contract calls :
The application already includes dedicated UI support for those specific contract calls :
* ERC 20 approve(address, uint256) - implementation in *src_features/erc20_approval*
* ERC 20 transfer(address, uint256) - implementation in *src_features/signTx*
@@ -45,4 +45,3 @@ A UI implementation might want to convert an ERC 20 token contract address to a
Two tickers can be temporarily provisioned to the application by using the PROVIDE ERC 20 TOKEN INFORMATION APDU, described in *src_features/provideErc20TokenInformation*. The UI can then iterate over the provisioned tickers to display relevant information to the user
The same mechanism will be extended to support well known contract addresses in the future

View File

@@ -41,6 +41,10 @@ Application version 1.9.19 - 2022-05-17
### 1.10.2
- Add domain names support
### 1.11.0
- Add EIP-712 amount & date/time filtering
- PROVIDE ERC 20 TOKEN INFORMATION & PROVIDE NFT INFORMATION now send back the index where the asset has been stored
## About
This application describes the APDU messages interface to communicate with the Ethereum application.
@@ -267,34 +271,38 @@ The signature is computed on
ticker || address || number of decimals (uint4be) || chainId (uint4be)
signed by the following secp256k1 public key 0482bbf2f34f367b2e5bc21847b6566f21f0976b22d3388a9a5e446ac62d25cf725b62a2555b2dd464a4da0ab2f4d506820543af1d242470b1b1a969a27578f353
signed by the following secp256k1 public key 045e6c1020c14dc46442fe89f97c0b68cdb15976dc24f24c316e7b30fe4e8cc76b1489150c21514ebf440ff5dea5393d83de5358cd098fce8fd0f81daa94979183
#### Coding
'Command'
[width="80%"]
|==============================================================================================================================
|======================================================================
| *CLA* | *INS* | *P1* | *P2* | *Lc* | *Le*
| E0 | 0A | 00 | 00 | variable | 00
|==============================================================================================================================
| E0 | 0A | 00 | 00 | variable | 00
|======================================================================
'Input data'
[width="80%"]
|==============================================================================================================================
| *Description* | *Length*
| Length of ERC 20 ticker | 1
| ERC 20 ticker | variable
| ERC 20 contract address | 20
| Number of decimals (big endian encoded) | 4
| Chain ID (big endian encoded) | 4
| Token information signature | variable
|==============================================================================================================================
|=======================================================================
| *Description* | *Length*
| Length of ERC 20 ticker | 1
| ERC 20 ticker | variable
| ERC 20 contract address | 20
| Number of decimals (big endian encoded) | 4
| Chain ID (big endian encoded) | 4
| Token information signature | variable
|=======================================================================
'Output data'
None
[width="80%"]
|====================================================================
| *Description* | *Length*
| Asset index where the information has been stored | 1
|====================================================================
### SIGN ETH EIP 712
@@ -509,7 +517,11 @@ type || version || len(collectionName) || collectionName || address || chainId |
'Output data'
None
[width="80%"]
|====================================================================
| *Description* | *Length*
| Asset index where the information has been stored | 1
|====================================================================
### SET PLUGIN
@@ -801,7 +813,7 @@ None
This command provides a trusted way of deciding which information from the JSON data to show, and of replacing some values with more meaningful ones.
This mode can be overriden by the in-app setting to fully clear-sign EIP-712 messages.
This mode can be overridden by the in-app setting to fully clear-sign EIP-712 messages.
For the signatures :
@@ -825,10 +837,36 @@ The signature is computed on :
183 || chain ID (BE) || contract address || schema hash || filters count || display name
##### Amount-join token
##### Show field
This command should come before the corresponding *SEND STRUCT IMPLEMENTATION* and are only usable for message fields (and not domain ones).
The first byte is used so that a signature of one type cannot be valid as another type.
These commands should come before the corresponding *SEND STRUCT IMPLEMENTATION* and are only usable for message fields (and not domain ones).
The signature is computed on :
11 || chain ID (BE) || contract address || schema hash || field path || token index
##### Amount-join value
This command should come before the corresponding *SEND STRUCT IMPLEMENTATION* and is only usable for message fields (and not domain ones).
A token index of 0xFF indicates the token address is in the _verifyingContract_ field of the EIP712Domain, so the app won't receive an amount-join token filtering APDU. This enables support for Permit (ERC-2612) messages.
The signature is computed on :
22 || chain ID (BE) || contract address || schema hash || field path || display name || token index
##### Date / Time
This command should come before the corresponding *SEND STRUCT IMPLEMENTATION* and is only usable for message fields (and not domain ones).
The signature is computed on :
33 || chain ID (BE) || contract address || schema hash || field path || display name
##### Show raw field
This command should come before the corresponding *SEND STRUCT IMPLEMENTATION* and is only usable for message fields (and not domain ones).
The first byte is used so that a signature of one type cannot be valid as another type.
The signature is computed on :
@@ -843,17 +881,23 @@ _Command_
|=========================================================================
| *CLA* | *INS* | *P1* | *P2* | *LC* | *Le*
| E0 | 1E | 00
| 00 : activate
| 00 : activation
0F : message info
FF : show field
FC : date/time
FD : amount-join token
FE : amount-join value
FF : raw field
| variable | variable
|=========================================================================
_Input data_
##### If P2 == activate
##### If P2 == activation
None
@@ -869,7 +913,40 @@ None
| Signature | variable
|==========================================
##### If P2 == show field
##### If P2 == date / time
[width="80%"]
|==========================================
| *Description* | *Length (byte)*
| Display name length | 1
| Display name | variable
| Signature length | 1
| Signature | variable
|==========================================
##### If P2 == amount-join token
[width="80%"]
|==========================================
| *Description* | *Length (byte)*
| Token index | 1
| Signature length | 1
| Signature | variable
|==========================================
##### If P2 == amount-join value
[width="80%"]
|==========================================
| *Description* | *Length (byte)*
| Display name length | 1
| Display name | variable
| Token index | 1
| Signature length | 1
| Signature | variable
|==========================================
##### If P2 == show raw field
[width="80%"]
|==========================================

View File

@@ -85,8 +85,6 @@ typedef struct ethPluginInitContract_t {
uint8_t *selector; // 4 bytes selector
uint32_t dataSize;
char *alias; // 29 bytes alias if ETH_PLUGIN_RESULT_OK_ALIAS set
uint8_t result;
} ethPluginInitContract_t;
@@ -102,7 +100,6 @@ This message is sent when the selector of the data has been parsed. The followin
The following return codes are expected; any other will abort the signing process :
* ETH_PLUGIN_RESULT_OK : if the plugin can be successfully initialized
* ETH_PLUGIN_RESULT_OK_ALIAS : if a base64 encoded alias of another plugin to call is copied to the _alias_ field. In this case, the dispatcher will follow the alias chain, and the original plugin will only be called to retrieve its name when using a generic user interface
* ETH_PLUGIN_RESULT_FALLBACK : if the signing logic should fallback to the generic one
### ETH_PLUGIN_PROVIDE_PARAMETER

View File

@@ -59,7 +59,7 @@ if args.path == None:
domainHash = binascii.unhexlify(args.domainHash)
messageHash = binascii.unhexlify(args.messageHash)
encodedTx = domainHash + messageHash
encodedTx = domainHash + messageHash
donglePath = parse_bip32_path(args.path)
apdu = bytearray.fromhex("e00c0000")

View File

1
glyphs/chain_10200_64px.gif Symbolic link

@@ -0,0 +1 @@
chain_100_64px.gif

BIN
glyphs/chain_10507_64px.gif Normal file (748 B)

1
glyphs/chain_1101_64px.gif Symbolic link

@@ -0,0 +1 @@
chain_137_64px.gif

@@ -0,0 +1 @@
chain_1_64px.gif

@@ -0,0 +1 @@
chain_10_64px.gif

BIN
glyphs/chain_1135_64px.gif Normal file (541 B)

@@ -0,0 +1 @@
chain_81457_64px.gif

1
glyphs/chain_17000_64px.gif Symbolic link

@@ -0,0 +1 @@
chain_1_64px.gif

BIN
glyphs/chain_1907_64px.gif Normal file (825 B)

@@ -0,0 +1 @@
chain_20531812_64px.gif

@@ -0,0 +1 @@
chain_245022934_64px.gif

1
glyphs/chain_300_64px.gif Symbolic link

@@ -0,0 +1 @@
chain_324_64px.gif

BIN
glyphs/chain_324_64px.gif Normal file (355 B)

BIN
glyphs/chain_3776_64px.gif Normal file (880 B)

1
glyphs/chain_3_64px.gif Symbolic link

@@ -0,0 +1 @@
chain_1_64px.gif

Some files were not shown because too many files have changed in this diff