
spark-wallet

License: MIT License
Language: JavaScript (Node.js)
Category: Web3, Cryptocurrency/Bitcoin
Type: Open source software
Region: Unknown
Submitted by: 车嘉实
Operating system: Cross-platform
Organization: Unknown
Audience: Unknown
Software Overview

Spark Lightning Wallet

  • Simple & minimalistic
  • Purely off-chain
  • Progressive Web App
  • Personalizable themes
  • Mobile and desktop apps
  • Automatic self-signed TLS
  • LetsEncrypt integration
  • Tor hidden service (v3)

Contents

Introduction

Spark is a minimalistic wallet GUI for c-lightning, accessible over the web or through mobile and desktop apps (for Android, Linux, macOS and Windows). It is currently oriented toward technically advanced users and is not an all-in-one package, but rather a "remote control" interface for a c-lightning node that has to be managed separately.

Spark supports sending and receiving payments, viewing history, and managing channels.

Spark is a purely off-chain wallet, with no on-chain payments. This allows Spark to fully realize the awesome UX enabled by lightning, without worrying about the complications and friction of on-chain. This might change someday.

Spark has a responsive UI suitable for mobile, tablet and desktop devices, but is best optimized for use on mobile.

⚠️ Spark is beta-quality software under active development, please use with care.

Big shout out to Blockstream for generously sponsoring this work!

Server installation

Requires a running c-lightning node (see setup instructions in the official docs or this tutorial) and nodejs v6.0 or newer (nodejs v8 is recommended; see instructions here. If you're running into permission issues, try this.)

$ npm install -g spark-wallet

$ spark-wallet # defaults: --ln-path ~/.lightning --port 9737

Or simply $ npx spark-wallet, which will install and start Spark in one go.

Spark will generate and print a random username and password that will be used to log in to the wallet, and persist them to ~/.spark-wallet/cookie (can be controlled with --cookie-file <path>). To specify your own login credentials, set --login [user]:[pass] or the LOGIN environment variable.

To access the wallet, open http://localhost:9737/ in your browser and log in with the username/password.
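The login is standard HTTP basic auth (see the --login CLI option), so scripted clients can authenticate by sending the usual Authorization header. A minimal sketch, using the illustrative bob:superSecretPassword credentials (not real ones):

```shell
# HTTP basic auth: the header value is base64("username:password").
# Credentials below are placeholders -- substitute your generated ones.
printf 'Authorization: Basic %s\n' \
  "$(printf '%s' 'bob:superSecretPassword' | base64)"
```

The same header is what `curl -u bob:superSecretPassword http://localhost:9737/` would send for you.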

You may also start Spark with --pairing-url, which will print a URL with an embedded access token that you can open in your browser to log in to the wallet without using the username/password. --pairing-qr provides the same URL as a QR code (useful for mobile pairing).

See $ spark-wallet --help for the full list of command-line options (also available under CLI options).

Configuration file

Spark reads configuration options from ~/.spark-wallet/config (can be overridden with --config/-c <path>). The expected format is one key=value per line, like so:

ln-path=/data/lightning/testnet
login=bob:superSecretPassword
port=8000
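As a sketch, such a file can be written from the shell; the values below are the same illustrative ones (point ln-path at your own node's data directory and pick your own credentials):

```shell
# Write the example settings to Spark's default config path.
# All values are placeholders, not recommendations.
mkdir -p "$HOME/.spark-wallet"
cat > "$HOME/.spark-wallet/config" <<'EOF'
ln-path=/data/lightning/testnet
login=bob:superSecretPassword
port=8000
EOF
cat "$HOME/.spark-wallet/config"
```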

Connecting remotely

To accept remote connections, set --host <listen-address> (shorthand -i, e.g. -i 0.0.0.0). This will automatically enable TLS with a self-signed certificate.

For more information on TLS, instructions for setting up a CA-signed certificate using the built-in LetsEncrypt integration, and for adding the self-signed certificate to Android, see doc/tls.md.
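Since --tls-path is simply a directory holding key.pem and cert.pem (per the CLI options below), you can also drop in a pair you generated yourself instead of letting Spark self-sign. A hedged sketch with openssl; the common name spark.example.com is a placeholder:

```shell
# Pre-generate a self-signed key/cert pair into the directory Spark
# reads with --tls-path. The CN is a placeholder hostname.
TLS_DIR="$HOME/.spark-wallet/tls"
mkdir -p "$TLS_DIR"
openssl req -x509 -newkey rsa:2048 -nodes -days 365 \
  -subj '/CN=spark.example.com' \
  -keyout "$TLS_DIR/key.pem" -out "$TLS_DIR/cert.pem" 2>/dev/null
# Confirm what was generated
openssl x509 -in "$TLS_DIR/cert.pem" -noout -subject
```

You would then start Spark with `--tls-name` matching the certificate's CN.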

To start Spark as a Tor hidden service, set --onion. You don't need Tor pre-installed for this to work. See doc/onion.md for more details, the advantages of using a hidden service, and instructions for connecting from Android. This is highly recommended.

Deploy with Docker

Spark is also available as a Docker image that comes bundled with bitcoind and c-lightning.See doc/docker.md for details.

Adding to startup with systemd

See doc/startup-systemd.md.

Desktop apps

Electron-based desktop apps for Linux (packaged as deb, AppImage, snap and tar.gz), macOS (as zip) and Windows (installer and a portable) are available for download from the releases page.

The desktop apps come bundled with the Spark server-side component. If you're connecting to a local c-lightning instance, you can configure the desktop app to connect to it directly without manually setting up the Spark server.

Connecting to a remote c-lightning instance requires setting up the Spark server on the same machine running c-lightning and connecting through it.

Mobile app

A Cordova-based native app for Android is available for download from the Google Play app store or from the releases page.

The app requires a Spark server to communicate with, which you need to set up as a prerequisite.

When the app starts for the first time, you'll need to configure your Spark server URL and API access key. You can print your access key to the console by starting Spark with --print-key/-k. You can also scan this information from a QR code, which you can get with --pairing-qr/-Q.

For the native app to properly communicate with the server, the TLS certificate has to be signed by a CA or manually added as a user trusted certificate.

Progressive Web App

You can install Spark as a PWA (on mobile and desktop) to get a more native-app-like experience, including a home launcher that opens up in full screen, system notifications and faster load times.

Available in Chrome mobile under the menu -> Add to homescreen (see here), in Chrome desktop under More tools -> Install to desktop (see here), and in Firefox mobile with an icon next to the address bar (see here).

Installing the PWA requires TLS and a CA-signed certificate (unless accessed via localhost).

Compared to the PWA, the main advantages of the mobile and desktop apps are the ability to handle lightning: URIs, a better security sandbox (detached from the browser) and static client-side code.

Note for iOS users: iOS does not allow PWAs to use WebRTC (required for the QR scanner), but Spark otherwise works as a PWA. The QR scanner works if you access Spark in the browser without using the "Add to homescreen" feature.

GUI settings & controls

  • Pay and Request are pretty intuitive and don't require much explaining. Try them!

  • Display unit: Click the balance on the top-right or the unit in the "request payment" page to toggle the currency display unit. The available options are sat, bits, milli, btc and usd.

  • Theme switcher: Click the theme name on the bottom-right to change themes (you can choose between 16 bootswatch themes).

  • Payment details: Click on payments in the list to display more details (note that the fee shown includes c-lightning's overpayment randomization).

  • Expert mode: Click the version number on the bottom-left to toggle expert mode. This will add two new menu items, "Logs" and "RPC Console", and display YAML dumps with additional information throughout the app.

  • Node address: Click the node id on the footer to open the node info page which displays your node address (as text and QR).

  • Channel management: Click the "Channels" button inside the node info page to show and manage channels.

Browser support

Supported on recent desktop and mobile versions of Chrome, Firefox and Safari. IE is unsupported.

Requires iOS 11.2+ for WebRTC (used by the QR scanner), but works otherwise with iOS 9+. Chrome on iOS does not support WebRTC.

Developing

Spark is written in a reactive-functional style using rxjs and cycle.js,with bootstrap for theming and a nodejs/express server as the backend.

To start a development server with live compilation for babel, browserify, pug and stylus, run:

$ git clone https://github.com/shesek/spark-wallet && cd spark-wallet
$ npm install
$ npm start -- --ln-path /data/lightning

Spark can be built from source using the following commands (more efficient than running the live compilation development server):

$ git clone https://github.com/shesek/spark-wallet && cd spark-wallet
$ npm install
$ npm run dist:npm
$ ./dist/cli.js --ln-path /data/lightning

Cordova builds can be prepared with npm run dist:cordova. The .apk file will be available in cordova/platforms/android/app/build/outputs/apk/debug/.

Electron builds can be prepared with npm run dist:electron. They will be available under electron/dist.

To get more verbose server-side logging, start the server with --verbose (or -V).

To get more verbose output in the browser developer console, set localStorage.debug = 'spark:*'.

See doc/dev-regtest-env.md for instructions on setting up a regtest environment with multiple wallets.

Pull requests, suggestions and comments are welcome!

Code signing & reproducible builds

Signed distribution checksums are available in the git repo at SHA256SUMS.asc (updated with every versioned release) and on the releases page. Git version tags are signed too.

The releases are signed by Nadav Ivgi (@shesek). The public key can be verified on keybase, github, twitter (under bio), HN, or on a domain he's known to control.

To install the signed NPM package, download it from the releases page, verify the hash and install using $ npm install -g spark-wallet-[x.y.z]-npm.tgz.
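The hash check itself is the standard sha256sum -c flow. A sketch with a stand-in file (in a real verification you would first check the PGP signature on SHA256SUMS.asc with gpg --verify, and the tarball would be the actual downloaded release rather than a file created here):

```shell
# Stand-in for a downloaded release tarball -- the filename and contents
# are placeholders for illustration only.
echo 'demo payload' > spark-wallet-x.y.z-npm.tgz

# In the real flow, SHA256SUMS.asc comes from the repo and is PGP-signed;
# here we generate a local sums file just to show the check command.
sha256sum spark-wallet-x.y.z-npm.tgz > SHA256SUMS

# The verification step: each listed file must match its recorded hash.
sha256sum -c SHA256SUMS
```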

To install the signed Docker image, get the image hash from SHA256SUMS.asc and install it with $ docker pull shesek/spark-wallet@sha256:[image-hash-verified-to-be-signed].

The NPM package, Android apk builds, Linux tar.gz/snap builds, macOS zip builds and Windows builds (installer and portable) are deterministically reproducible.

CLI options

$ spark-wallet --help

  A minimalistic wallet GUI for c-lightning

  Usage
    $ spark-wallet [options]

  Options
    -l, --ln-path <path>     path to c-lightning data directory [default: ~/.lightning]
    -p, --port <port>        http(s) server port [default: 9737]
    -i, --host <host>        http(s) server listen address [default: localhost]
    -u, --login <userpwd>    http basic auth login, "username:password" format [default: generate random]
    -C, --cookie-file <path> persist generated login credentials to <path> or load them [default: ~/.spark-wallet/cookie]
    --no-cookie-file         disable cookie file [default: false]

    --rate-provider <name>   exchange rate provider, one of "bitstamp" or "wasabi" (requires tor) [default: bitstamp]
    --no-rates               disable exchange rate lookup [default: false]
    --proxy <uri>            set a proxy for looking up rates, e.g. socks5h://127.0.0.1:9050 [default: none]

    --force-tls              enable TLS even when binding on localhost [default: enable for non-localhost only]
    --no-tls                 disable TLS for non-localhost hosts [default: false]
    --tls-path <path>        directory to read/store key.pem and cert.pem for TLS [default: ~/.spark-wallet/tls/]
    --tls-name <name>        common name for the TLS cert [default: {host}]

    --letsencrypt <email>    enable CA-signed certificate via LetsEncrypt [default: false]
    --le-port <port>         port to bind LetsEncrypt verification server [default: 80]
    --le-noverify            skip starting the LetsEncrypt verification server [default: start when {letsencrypt} is set]
    --le-debug               display additional debug information for LetsEncrypt [default: false]

    -o, --onion              start Tor Hidden Service (v3) [default: false]
    -O, --onion-path <path>  directory to read/store hidden service data [default: ~/.spark-wallet/tor/]
    --onion-nonanonymous     setup hidden service in non-anonymous mode [default: false]

    -k, --print-key          print access key to console (for use with the Cordova/Electron apps) [default: false]
    -q, --print-qr           print QR code with the server URL [default: false]
    -Q, --pairing-qr         print QR code with embedded access key [default: false]
    -P, --pairing-url        print URL with embedded access key [default: false]
    --public-url <url>       override public URL used for QR codes [default: http(s)://{host}/]

    --allow-cors <origin>    allow browser CORS requests from <origin> (USE WITH CARE) [default: off]
    --no-webui               run API server without serving client assets [default: false]
    --no-test-conn           skip testing access to c-lightning rpc (useful for init scripts) [default: false]

    -c, --config <path>      path to config file [default: ~/.spark-wallet/config]
    -V, --verbose            display debugging information [default: false]
    -h, --help               output usage information
    -v, --version            output version number

  Example
    $ spark-wallet -l ~/.lightning

  All options may also be specified as environment variables:
    $ LN_PATH=/data/lightning PORT=8070 NO_TLS=1 spark-wallet

License

MIT
