Compare commits

...

2 commits

Author SHA1 Message Date
78845777ad
julia_15: nixpkgs unstable → 21.05
Apparently the working julia version was moved into a stable channel since I
last looked at this, and the version in unstable is broken again. Fun!

Anyways, it builds again now (as part of parsons's config)
2021-08-26 22:11:54 +02:00
a00e28d85a
add pluto notebook server
Pluto [1] is one of these interactive notebook thingies that have become
so unreasonably popular with people doing machine learning or data
analysis, but – somewhat surprisingly – it's actually not shit (e.g. no
global mutable state in the notebook, no weird unreadable file format
that doesn't play well with version control, etc.)

In particular, it can be used collaboratively (while it doesn't do
real-time collaborative editing like a pad, it /does/ push out global
updates each time someone executes a cell, so it's reasonably close),
and I think it may be useful to have for julia-hacking sessions.

It may also be useful for people running low-end laptops, since code is
executed on the host — and I guess hainich has enough unused resources
lying around that we can spare a few.

After deploying this, the notebook server should be reachable via:
  ssh hainich -L 9999:localhost:9999
and then visiting http://localhost:9999

Caveats: by design, pluto allows a user to execute arbitrary code on the
host. That is its main function, and not something we can prevent. I've
tried to mitigate this as far as possible by:
 - only allowing access via ssh port forwarding. In theory pluto does
   have basic access control, but that works via a secret link that
   it'll spit to stdout on startup (i.e. the journal), which cannot be
   set in advance, nor regenerated without restarting the entire process.
   Unfortunately, this means we won't be able to use it at e.g.
   conference sessions with people who don't have access to our infra
 - running it in a nixos-container as its own user, so it should never
   get any kind of access to the "main" directory tree apart from a
   single directory that we can keep notebooks in (which is currently a
   bind mount set to /data/pluto)
 - limiting memory and cpu for that container via systemd (less out of
   worry for exploits, and more so that a few accidental while-true
   loops will never consume enough cpu time to noticeably slow down
   anything else). The current limits for both are chosen relatively low;
   we'll have to see if they become too limiting should anyone run an
   actual weather model on this.
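A minimal sketch of what that container + limit setup looks like in NixOS terms (hypothetical; the option paths follow the NixOS containers.* and systemd.services.* modules, but the mount point and limit values are made up, this is not the actual services/pluto.nix):

```nix
# Illustrative only — values are assumptions, not the deployed config.
{
  containers.pluto = {
    autoStart = true;
    bindMounts."/data" = {
      hostPath = "/data/pluto";   # notebooks live here on the host
      isReadOnly = false;
    };
  };
  # resource caps applied to the container's unit on the host
  systemd.services."container@pluto".serviceConfig = {
    CPUQuota = "100%";   # at most one core's worth of cpu time
    MemoryMax = "4G";
  };
}
```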

Things we could also do:
 - currently, the container does not have its own network (mostly since
   that would make it slightly less convenient to use with port
   forwarding); in theory, pluto should even be able to run entirely
   without internet access of its own, but I'm not sure if this would
   break things like loading images / raw data into a notebook
 - make the container ephemeral, and only keep the directory containing
   the notebooks. I haven't done this since it would require
   recompilation of pluto each time the container is wiped, which makes
   for a potentially inconvenient startup time (though still < 3-5 mins)
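The ephemeral variant would mostly be a one-line change (sketch; the ephemeral option does exist on NixOS containers, the rest is illustrative):

```nix
# Sketch of the ephemeral variant: the container's root filesystem is
# wiped on every restart, only the bind-mounted notebooks persist.
{
  containers.pluto = {
    ephemeral = true;
    bindMounts."/data" = {
      hostPath = "/data/pluto";
      isReadOnly = false;
    };
  };
}
```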

Questions:
 - have I missed anything important that should definitely also be
   sandboxed / limited in some way?
 - in general, are we comfortable running something like this?
 - would we (in principle) be comfortable opening this up to other
   people for congress sessions (assuming we figure out a reasonable
   access control)?

Notes to deployer:
 - while I have not tested this on hainich, it works on my own server
 - you will probably have to create the /data/pluto directory for the
   bind mount, and make it world-writable (or chown it to the pluto user
   inside the container)
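That setup step might look like this (sketch; only the /data/pluto path comes from the commit message, the PLUTO_DIR variable is just for illustration):

```shell
# one-time host-side setup for the bind mount source
pluto_dir="${PLUTO_DIR:-/data/pluto}"
mkdir -p "$pluto_dir"
chmod 0777 "$pluto_dir"  # world-writable; alternatively chown it to the
                         # pluto uid used inside the container
```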

[1] https://github.com/fonsp/Pluto.jl/
2021-08-26 21:27:49 +02:00
12 changed files with 1433 additions and 0 deletions


@@ -21,6 +21,7 @@
../../services/gitlab-runner.nix
../../services/unifi.nix
../../services/lantifa.nix
../../services/pluto.nix
./lxc.nix
];

pkgs/pluto/Manifest.toml Normal file

@@ -0,0 +1,199 @@
# This file is machine-generated - editing it directly is not advised
[[Artifacts]]
deps = ["Pkg"]
git-tree-sha1 = "c30985d8821e0cd73870b17b0ed0ce6dc44cb744"
uuid = "56f22d72-fd6d-98f1-02f0-08ddc0907c33"
version = "1.3.0"
[[Base64]]
uuid = "2a0f44e3-6c83-55bd-87e4-b1978d98bd5f"
[[Configurations]]
deps = ["Crayons", "ExproniconLite", "OrderedCollections", "TOML"]
git-tree-sha1 = "b8486a417456d2fbbe2af13e24cef459c9f42429"
uuid = "5218b696-f38b-4ac9-8b61-a12ec717816d"
version = "0.15.4"
[[Crayons]]
git-tree-sha1 = "3f71217b538d7aaee0b69ab47d9b7724ca8afa0d"
uuid = "a8cc5b0e-0ffa-5ad4-8c14-923d3ee1735f"
version = "4.0.4"
[[DataAPI]]
git-tree-sha1 = "dfb3b7e89e395be1e25c2ad6d7690dc29cc53b1d"
uuid = "9a962f9c-6df0-11e9-0e5d-c546b8b5ee8a"
version = "1.6.0"
[[DataValueInterfaces]]
git-tree-sha1 = "bfc1187b79289637fa0ef6d4436ebdfe6905cbd6"
uuid = "e2d170a0-9d28-54be-80f0-106bbe20a464"
version = "1.0.0"
[[Dates]]
deps = ["Printf"]
uuid = "ade2ca70-3891-5945-98fb-dc099432e06a"
[[Distributed]]
deps = ["Random", "Serialization", "Sockets"]
uuid = "8ba89e20-285c-5b6f-9357-94700520ee1b"
[[ExproniconLite]]
git-tree-sha1 = "c97ce5069033ac15093dc44222e3ecb0d3af8966"
uuid = "55351af7-c7e9-48d6-89ff-24e801d99491"
version = "0.6.9"
[[FuzzyCompletions]]
deps = ["REPL"]
git-tree-sha1 = "9cde086faa37f32794be3d2df393ff064d43cd66"
uuid = "fb4132e2-a121-4a70-b8a1-d5b831dcdcc2"
version = "0.4.1"
[[HTTP]]
deps = ["Base64", "Dates", "IniFile", "MbedTLS", "NetworkOptions", "Sockets", "URIs"]
git-tree-sha1 = "b855bf8247d6e946c75bb30f593bfe7fe591058d"
uuid = "cd3eb016-35fb-5094-929b-558a96fad6f3"
version = "0.9.8"
[[IniFile]]
deps = ["Test"]
git-tree-sha1 = "098e4d2c533924c921f9f9847274f2ad89e018b8"
uuid = "83e8ac13-25f8-5344-8a64-a9f2b223428f"
version = "0.5.0"
[[InteractiveUtils]]
deps = ["Markdown"]
uuid = "b77e0a4c-d291-57a0-90e8-8db25a27a240"
[[IteratorInterfaceExtensions]]
git-tree-sha1 = "a3f24677c21f5bbe9d2a714f95dcd58337fb2856"
uuid = "82899510-4779-5014-852e-03e436cf321d"
version = "1.0.0"
[[JLLWrappers]]
deps = ["Preferences"]
git-tree-sha1 = "642a199af8b68253517b80bd3bfd17eb4e84df6e"
uuid = "692b3bcd-3c85-4b1f-b108-f13ce0eb3210"
version = "1.3.0"
[[LibGit2]]
deps = ["Printf"]
uuid = "76f85450-5226-5b5a-8eaa-529ad045b433"
[[Libdl]]
uuid = "8f399da3-3557-5675-b5ff-fb832c97cbdb"
[[LinearAlgebra]]
deps = ["Libdl"]
uuid = "37e2e46d-f89d-539d-b4ee-838fcccc9c8e"
[[Logging]]
uuid = "56ddb016-857b-54e1-b83d-db4d58db5568"
[[Markdown]]
deps = ["Base64"]
uuid = "d6f4376e-aef5-505a-96c1-9c027394607a"
[[MbedTLS]]
deps = ["Dates", "MbedTLS_jll", "Random", "Sockets"]
git-tree-sha1 = "1c38e51c3d08ef2278062ebceade0e46cefc96fe"
uuid = "739be429-bea8-5141-9913-cc70e7f3736d"
version = "1.0.3"
[[MbedTLS_jll]]
deps = ["Artifacts", "JLLWrappers", "Libdl", "Pkg"]
git-tree-sha1 = "0eef589dd1c26a3ac9d753fe1a8bcad63f956fa6"
uuid = "c8ffd9c3-330d-5841-b78e-0817d7145fa1"
version = "2.16.8+1"
[[MsgPack]]
deps = ["Serialization"]
git-tree-sha1 = "a8cbf066b54d793b9a48c5daa5d586cf2b5bd43d"
uuid = "99f44e22-a591-53d1-9472-aa23ef4bd671"
version = "1.1.0"
[[NetworkOptions]]
git-tree-sha1 = "ed3157f48a05543cce9b241e1f2815f7e843d96e"
uuid = "ca575930-c2e3-43a9-ace4-1e988b2c1908"
version = "1.2.0"
[[OrderedCollections]]
git-tree-sha1 = "85f8e6578bf1f9ee0d11e7bb1b1456435479d47c"
uuid = "bac558e1-5e72-5ebc-8fee-abe8a469f55d"
version = "1.4.1"
[[Pkg]]
deps = ["Dates", "LibGit2", "Libdl", "Logging", "Markdown", "Printf", "REPL", "Random", "SHA", "UUIDs"]
uuid = "44cfe95a-1eb2-52ea-b672-e2afdf69b78f"
[[Pluto]]
deps = ["Base64", "Configurations", "Dates", "Distributed", "FuzzyCompletions", "HTTP", "InteractiveUtils", "Logging", "Markdown", "MsgPack", "Pkg", "REPL", "Sockets", "TableIOInterface", "Tables", "UUIDs"]
git-tree-sha1 = "85156b21dee3a4515ff479555eb958ad33c057aa"
uuid = "c3e4b0f8-55cb-11ea-2926-15256bba5781"
version = "0.14.5"
[[Preferences]]
deps = ["TOML"]
git-tree-sha1 = "00cfd92944ca9c760982747e9a1d0d5d86ab1e5a"
uuid = "21216c6a-2e73-6563-6e65-726566657250"
version = "1.2.2"
[[Printf]]
deps = ["Unicode"]
uuid = "de0858da-6303-5e67-8744-51eddeeeb8d7"
[[REPL]]
deps = ["InteractiveUtils", "Markdown", "Sockets"]
uuid = "3fa0cd96-eef1-5676-8a61-b3b8758bbffb"
[[Random]]
deps = ["Serialization"]
uuid = "9a3f8284-a2c9-5f02-9a11-845980a1fd5c"
[[SHA]]
uuid = "ea8e919c-243c-51af-8825-aaa63cd721ce"
[[Serialization]]
uuid = "9e88b42a-f829-5b0c-bbe9-9e923198166b"
[[Sockets]]
uuid = "6462fe0b-24de-5631-8697-dd941f90decc"
[[TOML]]
deps = ["Dates"]
git-tree-sha1 = "44aaac2d2aec4a850302f9aa69127c74f0c3787e"
uuid = "fa267f1f-6049-4f14-aa54-33bafae1ed76"
version = "1.0.3"
[[TableIOInterface]]
git-tree-sha1 = "9a0d3ab8afd14f33a35af7391491ff3104401a35"
uuid = "d1efa939-5518-4425-949f-ab857e148477"
version = "0.1.6"
[[TableTraits]]
deps = ["IteratorInterfaceExtensions"]
git-tree-sha1 = "c06b2f539df1c6efa794486abfb6ed2022561a39"
uuid = "3783bdb8-4a98-5b6b-af9a-565f29a5fe9c"
version = "1.0.1"
[[Tables]]
deps = ["DataAPI", "DataValueInterfaces", "IteratorInterfaceExtensions", "LinearAlgebra", "TableTraits", "Test"]
git-tree-sha1 = "c9d2d262e9a327be1f35844df25fe4561d258dc9"
uuid = "bd369af6-aec1-5ad0-b16a-f7cc5008161c"
version = "1.4.2"
[[Test]]
deps = ["Distributed", "InteractiveUtils", "Logging", "Random"]
uuid = "8dfed614-e22c-5e08-85e1-65c5234f0b40"
[[URIs]]
git-tree-sha1 = "97bbe755a53fe859669cd907f2d96aee8d2c1355"
uuid = "5c2747f8-b7ea-4ff2-ba2e-563bfd36b1d4"
version = "1.3.0"
[[UUIDs]]
deps = ["Random", "SHA"]
uuid = "cf7118a7-6976-5b1a-9a39-7adc72f591a4"
[[Unicode]]
uuid = "4ec0a83e-493e-50e2-b9ac-8f72acf5a8f5"

pkgs/pluto/Project.toml Normal file

@@ -0,0 +1,2 @@
[deps]
Pluto = "c3e4b0f8-55cb-11ea-2926-15256bba5781"

pkgs/pluto/Readme.org Normal file

@@ -0,0 +1,50 @@
#+TITLE: Pluto standalone
This is a nix derivation that wraps a version of julia with the packages
(and their dependency closure) defined in ~Project.toml~; currently
this is just Pluto. It also provides a little julia script that will
activate this packageset (called a "julia depot"), and then start a
Pluto notebook server on port 9999. Note that it does so without setting
up any kind of authentication, so don't expose that port!
* TODOs
- [ ] add more packages
- [ ] more sensible auth
- [ ] working precompilation; this would probably allow running this
without a writable julia depot
* Steps to reproduce / update the julia-to-nix part of this:
In general, it is enough to follow the readme of julia2nix, but since
that has a bunch of unstated assumptions and weird failure modes, here's a rough outline of how to actually do it:
- using julia's package managing mode, make changes to ~Project.toml~
and ~Manifest.toml~. Be sure to use the same version of julia that
will run on hainich (currently the ~julia_15~ package of
nixpkgs-unstable), as otherwise hashes may be different
- clone https://github.com/thomasjm/julia2nix somewhere
- run ~julia2nix~ to generate the nix derivations
+ unfortunately, julia2nix assumes that nixpkgs-unstable is available
as ~<nixpkgs>~ from within nix; it may fail if you are on another
channel. In that case, there seems to be no better solution than
grepping for occurrences of "<nixpkgs>" in julia2nix and replacing
them with some other path that has the required version
+ this will re-generate all the ~*.nix~ files in this directory, and
probably reset all options. The defaults are reasonably sensible,
but make sure to disable the ~precompile~ option in ~default.nix~
(see below for why)
- run ~nix-build --no-out-link~ to check if it worked and nix can
build the julia depot
- deploy hainich. Note that the derivation will only contain the
package sources, not a compiled version. Julia will compile packages
on startup (and cache them for subsequent runs), so after the deploy
it may take a minute or two to actually run Pluto
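The grep-and-replace for ~<nixpkgs>~ mentioned above can be done mechanically, e.g. (sketch; the checkout location and the replacement path are assumptions):

```shell
# rewrite julia2nix's hard-coded <nixpkgs> references to a pinned path
# (adjust checkout= and the replacement path to your setup)
checkout="${checkout:-/tmp/julia2nix}"
grep -rl '<nixpkgs>' "$checkout" \
  | xargs -r sed -i 's|<nixpkgs>|/path/to/nixpkgs-unstable|g'
```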
* precompilation
/In theory/, we should be able to precompile all the packages during
nix-build, and then directly load them into julia at runtime.
However, this currently fails, and even if precompiled packages are
present in the depot built via nix, julia will refuse to use them and
recompile them instead (in a second julia depot that is writable at
runtime).
There's an open issue on this at julia2nix:
https://github.com/thomasjm/julia2nix/issues/22

pkgs/pluto/common.nix Normal file

@@ -0,0 +1,160 @@
{
callPackage,
curl,
fetchurl,
git,
stdenvNoCC,
cacert,
jq,
julia,
lib,
python3,
runCommand,
stdenv,
writeText,
makeWrapper,
# Arguments
makeWrapperArgs ? "",
precompile ? true,
extraBuildInputs ? []
}:
let
# We need to use a specially modified fetchgit that understands tree hashes, until
# https://github.com/NixOS/nixpkgs/pull/104714 lands
fetchgit = callPackage ./fetchgit {};
packages = callPackage ./packages.nix {};
### Repoify packages
# This step is needed because leaveDotGit is not reproducible
# https://github.com/NixOS/nixpkgs/issues/8567
repoified = map (item: if item.src == null then item else item // { src = repoify item.name item.treehash item.src; }) packages.closure;
repoify = name: treehash: src:
runCommand ''${name}-repoified'' {buildInputs = [git];} ''
mkdir -p $out
cp -r ${src}/. $out
cd $out
git init
git add . -f
git config user.email "julia2nix@localhost"
git config user.name "julia2nix"
git commit -m "Dummy commit"
if [[ -n "${treehash}" ]]; then
if [[ $(git cat-file -t ${treehash}) != "tree" ]]; then
echo "Couldn't find desired tree object for ${name} in repoify (${treehash})"
exit 1
fi
fi
'';
repoifiedReplaceInManifest = lib.filter (x: x.replaceUrlInManifest != null) repoified;
### Manifest.toml (processed)
manifestToml = runCommand "Manifest.toml" { buildInputs = [jq]; } ''
cp ${./Manifest.toml} ./Manifest.toml
echo ${writeText "packages.json" (lib.generators.toJSON {} repoifiedReplaceInManifest)}
cat ${writeText "packages.json" (lib.generators.toJSON {} repoifiedReplaceInManifest)} | jq -r '.[]|[.name, .replaceUrlInManifest, .src] | @tsv' |
while IFS=$'\t' read -r name replaceUrlInManifest src; do
sed -i "s|$replaceUrlInManifest|file://$src|g" ./Manifest.toml
done
cp ./Manifest.toml $out
'';
### Overrides.toml
fetchArtifact = x: stdenv.mkDerivation {
name = x.name;
src = fetchurl { url = x.url; sha256 = x.sha256; };
sourceRoot = ".";
dontConfigure = true;
dontBuild = true;
installPhase = "cp -r . $out";
dontFixup = true;
};
artifactOverrides = lib.zipAttrsWith (name: values: fetchArtifact (lib.head (lib.head values))) (
map (item: item.artifacts) packages.closure
);
overridesToml = runCommand "Overrides.toml" { buildInputs = [jq]; } ''
echo '${lib.generators.toJSON {} artifactOverrides}' | jq -r '. | to_entries | map ((.key + " = \"" + .value + "\"")) | .[]' > $out
'';
### Processed registry
generalRegistrySrc = repoify "julia-general" "" (fetchgit {
url = packages.registryUrl;
rev = packages.registryRev;
sha256 = packages.registrySha256;
branchName = "master";
});
registry = runCommand "julia-registry" { buildInputs = [(python3.withPackages (ps: [ps.toml])) jq git]; } ''
git clone ${generalRegistrySrc}/. $out
cd $out
cat ${writeText "packages.json" (lib.generators.toJSON {} repoified)} | jq -r '.[]|[.name, .path, .src] | @tsv' |
while IFS=$'\t' read -r name path src; do
# echo "Processing: $name, $path, $src"
if [[ "$path" != "null" ]]; then
python -c "import toml; \
packageTomlPath = '$path/Package.toml'; \
contents = toml.load(packageTomlPath); \
contents['repo'] = 'file://$src'; \
f = open(packageTomlPath, 'w'); \
f.write(toml.dumps(contents)); \
"
fi
done
export HOME=$(pwd)
git config --global user.email "julia-to-nix-depot@email.com"
git config --global user.name "julia-to-nix-depot script"
git add .
git commit -m "Switch to local package repos"
'';
depot = runCommand "julia-depot" {
buildInputs = [git curl julia] ++ extraBuildInputs;
inherit registry precompile;
} ''
export HOME=$(pwd)
echo "Using registry $registry"
echo "Using Julia ${julia}/bin/julia"
cp ${manifestToml} ./Manifest.toml
cp ${./Project.toml} ./Project.toml
mkdir -p $out/artifacts
cp ${overridesToml} $out/artifacts/Overrides.toml
export JULIA_DEPOT_PATH=$out
julia -e ' \
using Pkg
Pkg.Registry.add(RegistrySpec(path="${registry}"))
Pkg.activate(".")
Pkg.instantiate()
# Remove the registry to save space
Pkg.Registry.rm("General")
'
if [[ -n "$precompile" ]]; then
julia -e ' \
using Pkg
Pkg.activate(".")
Pkg.precompile()
'
fi
'';
in
runCommand "julia-env" {
inherit julia depot makeWrapperArgs;
buildInputs = [makeWrapper];
} ''
mkdir -p $out/bin
makeWrapper $julia/bin/julia $out/bin/julia --suffix JULIA_DEPOT_PATH : "$depot" $makeWrapperArgs
''

pkgs/pluto/default.nix Normal file

@@ -0,0 +1,44 @@
{ pkgs }:
with pkgs;
let
# The base Julia version
baseJulia = julia_15;
# Extra libraries for Julia's LD_LIBRARY_PATH.
# Recent Julia packages that use Artifacts.toml to specify their dependencies
# shouldn't need this.
# But if a package implicitly depends on some library being present at runtime, you can
# add it here.
extraLibs = [];
# Wrapped Julia with libraries and environment variables.
# Note: setting the PYTHON environment variable is recommended to prevent packages
# from trying to obtain their own copy of python with Conda.
julia = runCommand "julia-wrapped" { buildInputs = [makeWrapper]; } ''
mkdir -p $out/bin
makeWrapper ${baseJulia}/bin/julia $out/bin/julia \
--suffix LD_LIBRARY_PATH : "${lib.makeLibraryPath extraLibs}" \
--set PYTHON ${python3}/bin/python
'';
in
callPackage ./common.nix {
inherit julia;
# Run Pkg.precompile() to precompile all packages?
precompile = false;
# Extra arguments to makeWrapper when creating the final Julia wrapper.
# By default, it will just put the new depot at the end of JULIA_DEPOT_PATH.
# You can add additional flags here.
makeWrapperArgs = "";
# Extra buildInputs for building the Julia depot. Useful if your packages have
# additional build-time dependencies not managed through the Artifacts.toml system.
# Defaults to extraLibs, but can be configured independently.
extraBuildInputs = extraLibs;
}

pkgs/pluto/fetchgit/builder.sh Normal file

@@ -0,0 +1,16 @@
# tested so far with:
# - no revision specified and remote has a HEAD which is used
# - revision specified and remote has a HEAD
# - revision specified and remote without HEAD
source $stdenv/setup
header "exporting $url (rev $rev) into $out"
$SHELL $fetcher --builder --url "$url" --out "$out" --rev "$rev" \
${leaveDotGit:+--leave-dotGit} \
${deepClone:+--deepClone} \
${fetchSubmodules:+--fetch-submodules} \
${branchName:+--branch-name "$branchName"}
runHook postFetch
stopNest

pkgs/pluto/fetchgit/default.nix Normal file

@@ -0,0 +1,71 @@
{stdenvNoCC, git, cacert}: let
urlToName = url: rev: let
inherit (stdenvNoCC.lib) removeSuffix splitString last;
base = last (splitString ":" (baseNameOf (removeSuffix "/" url)));
matched = builtins.match "(.*).git" base;
short = builtins.substring 0 7 rev;
appendShort = if (builtins.match "[a-f0-9]*" rev) != null
then "-${short}"
else "";
in "${if matched == null then base else builtins.head matched}${appendShort}";
in
{ url, rev ? "HEAD", md5 ? "", sha256 ? "", leaveDotGit ? deepClone
, fetchSubmodules ? true, deepClone ? false
, branchName ? null
, name ? urlToName url rev
, # Shell code executed after the file has been fetched
# successfully. This can do things like check or transform the file.
postFetch ? ""
, preferLocalBuild ? true
}:
/* NOTE:
fetchgit has one problem: git fetch only works for refs.
This is because fetching arbitrary (maybe dangling) commits may be a security risk
and checking whether a commit belongs to a ref is expensive. This may
change in the future when some caching is added to git (?)
Usually refs are either tags (refs/tags/*) or branches (refs/heads/*)
Cloning branches will make the hash check fail when there is an update.
But not all patches we want can be accessed by tags.
The workaround is getting the last n commits so that it's likely that they
still contain the hash we want.
for now : increase depth iteratively (TODO)
real fix: ask git folks to add a
git fetch $HASH contained in $BRANCH
facility because checking that $HASH is contained in $BRANCH is less
expensive than fetching --depth $N.
Even if git folks implemented this feature soon it may take years until
server admins start using the new version?
*/
assert deepClone -> leaveDotGit;
if md5 != "" then
throw "fetchgit does not support md5 anymore, please use sha256"
else
stdenvNoCC.mkDerivation {
inherit name;
builder = ./builder.sh;
fetcher = ./nix-prefetch-git; # This must be a string to ensure it's called with bash.
nativeBuildInputs = [git];
outputHashAlgo = "sha256";
outputHashMode = "recursive";
outputHash = sha256;
inherit url rev leaveDotGit fetchSubmodules deepClone branchName postFetch;
GIT_SSL_CAINFO = "${cacert}/etc/ssl/certs/ca-bundle.crt";
impureEnvVars = stdenvNoCC.lib.fetchers.proxyImpureEnvVars ++ [
"GIT_PROXY_COMMAND" "SOCKS_SERVER"
];
inherit preferLocalBuild;
}

pkgs/pluto/fetchgit/nix-prefetch-git Normal file

@@ -0,0 +1,459 @@
#! /usr/bin/env bash
set -e -o pipefail
url=
rev=
expHash=
hashType=$NIX_HASH_ALGO
deepClone=$NIX_PREFETCH_GIT_DEEP_CLONE
leaveDotGit=$NIX_PREFETCH_GIT_LEAVE_DOT_GIT
fetchSubmodules=
builder=
branchName=$NIX_PREFETCH_GIT_BRANCH_NAME
# ENV params
out=${out:-}
http_proxy=${http_proxy:-}
# populated by clone_user_rev()
fullRev=
humanReadableRev=
commitDate=
commitDateStrict8601=
if test -n "$deepClone"; then
deepClone=true
else
deepClone=
fi
if test "$leaveDotGit" != 1; then
leaveDotGit=
else
leaveDotGit=true
fi
usage(){
echo >&2 "syntax: nix-prefetch-git [options] [URL [REVISION [EXPECTED-HASH]]]
Options:
--out path Path where the output would be stored.
--url url Any url understood by 'git clone'.
--rev ref Any sha1 or references (such as refs/heads/master)
--hash h Expected hash.
--branch-name Branch name to check out into
--deepClone Clone the entire repository.
--no-deepClone Make a shallow clone of just the required ref.
--leave-dotGit Keep the .git directories.
--fetch-submodules Fetch submodules.
--builder Clone as fetchgit does, but url, rev, and out option are mandatory.
--quiet Only print the final json summary.
"
exit 1
}
# some git commands print to stdout, which would contaminate our JSON output
clean_git(){
git "$@" >&2
}
argi=0
argfun=""
for arg; do
if test -z "$argfun"; then
case $arg in
--out) argfun=set_out;;
--url) argfun=set_url;;
--rev) argfun=set_rev;;
--hash) argfun=set_hashType;;
--branch-name) argfun=set_branchName;;
--deepClone) deepClone=true;;
--quiet) QUIET=true;;
--no-deepClone) deepClone=;;
--leave-dotGit) leaveDotGit=true;;
--fetch-submodules) fetchSubmodules=true;;
--builder) builder=true;;
-h|--help) usage; exit;;
*)
: $((++argi))
case $argi in
1) url=$arg;;
2) rev=$arg;;
3) expHash=$arg;;
*) exit 1;;
esac
;;
esac
else
case $argfun in
set_*)
var=${argfun#set_}
eval $var=$arg
;;
esac
argfun=""
fi
done
if test -z "$url"; then
usage
fi
init_remote(){
local url=$1
clean_git init
clean_git remote add origin "$url"
( [ -n "$http_proxy" ] && clean_git config http.proxy "$http_proxy" ) || true
}
# Return the reference of a hash if it exists on the remote repository.
ref_from_hash(){
local hash=$1
git ls-remote origin | sed -n "\,$hash\t, { s,\(.*\)\t\(.*\),\2,; p; q}"
}
# Return the hash of a reference if it exists on the remote repository.
hash_from_ref(){
local ref=$1
git ls-remote origin | sed -n "\,\t$ref, { s,\(.*\)\t\(.*\),\1,; p; q}"
}
# Returns a name based on the url and reference
#
# This function needs to be in sync with nix's fetchgit implementation
# of urlToName() to re-use the same nix store paths.
url_to_name(){
local url=$1
local ref=$2
local base
base=$(basename "$url" .git | cut -d: -f2)
if [[ $ref =~ ^[a-z0-9]+$ ]]; then
echo "$base-${ref:0:7}"
else
echo "$base"
fi
}
# Fetch everything and checkout the right sha1
checkout_hash(){
local hash="$1"
local ref="$2"
if test -z "$hash"; then
hash=$(hash_from_ref "$ref")
fi
clean_git fetch -t ${builder:+--progress} origin || return 1
local object_type=$(git cat-file -t "$hash")
if [[ "$object_type" == "commit" ]]; then
clean_git checkout -b "$branchName" "$hash" || return 1
elif [[ "$object_type" == "tree" ]]; then
clean_git config user.email "nix-prefetch-git@localhost"
clean_git config user.name "nix-prefetch-git"
commit_id=$(git commit-tree "$hash" -m "Commit created from tree hash $hash")
clean_git checkout -b "$branchName" "$commit_id" || return 1
else
echo "Unrecognized git object type: $object_type"
return 1
fi
}
# Fetch only a branch/tag and check it out.
checkout_ref(){
local hash="$1"
local ref="$2"
if [[ -n "$deepClone" ]]; then
# The caller explicitly asked for a deep clone. Deep clones
# allow "git describe" and similar tools to work. See
# https://marc.info/?l=nix-dev&m=139641582514772
# for a discussion.
return 1
fi
if test -z "$ref"; then
ref=$(ref_from_hash "$hash")
fi
if test -n "$ref"; then
# --depth option is ignored on http repository.
clean_git fetch ${builder:+--progress} --depth 1 origin +"$ref" || return 1
clean_git checkout -b "$branchName" FETCH_HEAD || return 1
else
return 1
fi
}
# Update submodules
init_submodules(){
# Add urls into .git/config file
clean_git submodule init
# list submodule directories and their hashes
git submodule status |
while read -r l; do
local hash
local dir
local name
local url
# checkout each submodule
hash=$(echo "$l" | awk '{print $1}' | tr -d '-')
dir=$(echo "$l" | sed -n 's/^.[0-9a-f]\+ \(.*[^)]*\)\( (.*)\)\?$/\1/p')
name=$(
git config -f .gitmodules --get-regexp submodule\..*\.path |
sed -n "s,^\(.*\)\.path $dir\$,\\1,p")
url=$(git config --get "${name}.url")
clone "$dir" "$url" "$hash" ""
done
}
clone(){
local top=$PWD
local dir="$1"
local url="$2"
local hash="$3"
local ref="$4"
cd "$dir"
# Initialize the repository.
init_remote "$url"
# Download data from the repository.
checkout_ref "$hash" "$ref" ||
checkout_hash "$hash" "$ref" || (
echo 1>&2 "Unable to checkout $hash$ref from $url."
exit 1
)
# Checkout linked sources.
if test -n "$fetchSubmodules"; then
init_submodules
fi
if [ -z "$builder" ] && [ -f .topdeps ]; then
if tg help &>/dev/null; then
echo "populating TopGit branches..."
tg remote --populate origin
else
echo "WARNING: would populate TopGit branches but TopGit is not available" >&2
echo "WARNING: install TopGit to fix the problem" >&2
fi
fi
cd "$top"
}
# Remove all remote branches, remove tags not reachable from HEAD, do a full
# repack and then garbage collect unreferenced objects.
make_deterministic_repo(){
local repo="$1"
# run in sub-shell to not touch current working directory
(
cd "$repo"
# Remove files that contain timestamps or otherwise have non-deterministic
# properties.
rm -rf .git/logs/ .git/hooks/ .git/index .git/FETCH_HEAD .git/ORIG_HEAD \
.git/refs/remotes/origin/HEAD .git/config
# Remove all remote branches.
git branch -r | while read -r branch; do
clean_git branch -rD "$branch"
done
# Remove tags not reachable from HEAD. If we're exactly on a tag, don't
# delete it.
maybe_tag=$(git tag --points-at HEAD)
git tag --contains HEAD | while read -r tag; do
if [ "$tag" != "$maybe_tag" ]; then
clean_git tag -d "$tag"
fi
done
# Do a full repack. Must run single-threaded, or else we lose determinism.
clean_git config pack.threads 1
clean_git repack -A -d -f
rm -f .git/config
# Garbage collect unreferenced objects.
# Note: --keep-largest-pack prevents non-deterministic ordering of packs
# listed in .git/objects/info/packs by only using a single pack
clean_git gc --prune=all --keep-largest-pack
)
}
clone_user_rev() {
local dir="$1"
local url="$2"
local rev="${3:-HEAD}"
# Perform the checkout.
case "$rev" in
HEAD|refs/*)
clone "$dir" "$url" "" "$rev" 1>&2;;
*)
if test -z "$(echo "$rev" | tr -d 0123456789abcdef)"; then
clone "$dir" "$url" "$rev" "" 1>&2
else
# if revision is not hexadecimal it might be a tag
clone "$dir" "$url" "" "refs/tags/$rev" 1>&2
fi;;
esac
pushd "$dir" >/dev/null
fullRev=$( (git rev-parse "$rev" 2>/dev/null || git rev-parse "refs/heads/$branchName") | tail -n1)
humanReadableRev=$(git describe "$fullRev" 2> /dev/null || git describe --tags "$fullRev" 2> /dev/null || echo -- none --)
commitDate=$(git show -1 --no-patch --pretty=%ci "$fullRev")
commitDateStrict8601=$(git show -1 --no-patch --pretty=%cI "$fullRev")
popd >/dev/null
# Allow doing additional processing before .git removal
eval "$NIX_PREFETCH_GIT_CHECKOUT_HOOK"
if test -z "$leaveDotGit"; then
echo "removing \`.git'..." >&2
find "$dir" -name .git -print0 | xargs -0 rm -rf
else
find "$dir" -name .git | while read -r gitdir; do
make_deterministic_repo "$(readlink -f "$gitdir/..")"
done
fi
}
exit_handlers=()
run_exit_handlers() {
exit_status=$?
for handler in "${exit_handlers[@]}"; do
eval "$handler $exit_status"
done
}
trap run_exit_handlers EXIT
quiet_exit_handler() {
exec 2>&3 3>&-
if [ $1 -ne 0 ]; then
cat "$errfile" >&2
fi
rm -f "$errfile"
}
quiet_mode() {
errfile="$(mktemp "${TMPDIR:-/tmp}/git-checkout-err-XXXXXXXX")"
exit_handlers+=(quiet_exit_handler)
exec 3>&2 2>"$errfile"
}
json_escape() {
local s="$1"
# the patterns below were literal control bytes in the original script
# (backspace, form feed, newline, carriage return, tab); ANSI-C quoting
# makes them survive display and copy/paste without changing behaviour
local bs=$'\b' ff=$'\f' nl=$'\n' cr=$'\r' tab=$'\t'
s="${s//\\/\\\\}" # \
s="${s//\"/\\\"}" # "
s="${s//$bs/\\\b}" # \b (backspace)
s="${s//$ff/\\\f}" # \f (form feed)
s="${s//$nl/\\\n}" # \n (newline)
s="${s//$cr/\\\r}" # \r (carriage return)
s="${s//$tab/\\t}" # \t (tab)
echo "$s"
}
print_results() {
hash="$1"
if ! test -n "$QUIET"; then
echo "" >&2
echo "git revision is $fullRev" >&2
if test -n "$finalPath"; then
echo "path is $finalPath" >&2
fi
echo "git human-readable version is $humanReadableRev" >&2
echo "Commit date is $commitDate" >&2
if test -n "$hash"; then
echo "hash is $hash" >&2
fi
fi
if test -n "$hash"; then
cat <<EOF
{
"url": "$(json_escape "$url")",
"rev": "$(json_escape "$fullRev")",
"date": "$(json_escape "$commitDateStrict8601")",
"path": "$(json_escape "$finalPath")",
"$(json_escape "$hashType")": "$(json_escape "$hash")",
"fetchSubmodules": $([[ -n "$fetchSubmodules" ]] && echo true || echo false),
"deepClone": $([[ -n "$deepClone" ]] && echo true || echo false),
"leaveDotGit": $([[ -n "$leaveDotGit" ]] && echo true || echo false)
}
EOF
fi
}
remove_tmpPath() {
rm -rf "$tmpPath"
}
if test -n "$QUIET"; then
quiet_mode
fi
if test -z "$branchName"; then
branchName=fetchgit
fi
if test -n "$builder"; then
test -n "$out" -a -n "$url" -a -n "$rev" || usage
mkdir -p "$out"
clone_user_rev "$out" "$url" "$rev"
else
if test -z "$hashType"; then
hashType=sha256
fi
# If the hash was given, a file with that hash may already be in the
# store.
if test -n "$expHash"; then
finalPath=$(nix-store --print-fixed-path --recursive "$hashType" "$expHash" "$(url_to_name "$url" "$rev")")
if ! nix-store --check-validity "$finalPath" 2> /dev/null; then
finalPath=
fi
hash=$expHash
fi
# If we don't know the hash or a path with that hash doesn't exist,
# download the file and add it to the store.
if test -z "$finalPath"; then
tmpPath="$(mktemp -d "${TMPDIR:-/tmp}/git-checkout-tmp-XXXXXXXX")"
exit_handlers+=(remove_tmpPath)
tmpFile="$tmpPath/$(url_to_name "$url" "$rev")"
mkdir -p "$tmpFile"
# Perform the checkout.
clone_user_rev "$tmpFile" "$url" "$rev"
# Compute the hash.
hash=$(nix-hash --type $hashType --base32 "$tmpFile")
# Add the downloaded file to the Nix store.
finalPath=$(nix-store --add-fixed --recursive "$hashType" "$tmpFile")
if test -n "$expHash" -a "$expHash" != "$hash"; then
echo "hash mismatch for URL \`$url'. Got \`$hash'; expected \`$expHash'." >&2
exit 1
fi
fi
print_results "$hash"
if test -n "$PRINT_PATH"; then
echo "$finalPath"
fi
fi

pkgs/pluto/packages.nix Normal file

@@ -0,0 +1,350 @@
# This file is autogenerated, do not edit by hand!
{fetchgit}: {
registryUrl = "https://github.com/JuliaRegistries/General.git";
registryRev = "c67828a86f7501f4d487607ded065dbff37ee456";
registrySha256 = "1jsg9wf6gwfaswld0d6mm9g8i4v2nrd7265a2an9xh3zdk1blcny";
rootPackages = ["Pluto"];
closure = [{
name = "Artifacts";
uuid = "56f22d72-fd6d-98f1-02f0-08ddc0907c33";
path = "A/Artifacts";
replaceUrlInManifest = null;
treehash = "c30985d8821e0cd73870b17b0ed0ce6dc44cb744";
artifacts = {};
src = fetchgit { url = "https://github.com/JuliaPackaging/Artifacts.jl.git"; rev = "c30985d8821e0cd73870b17b0ed0ce6dc44cb744"; sha256 = "0i0s26ypiwg6zyb3aqn9kiyiblkkab1mfac9plmq2jv4hggm4jfc"; };
} {
name = "Base64";
uuid = "2a0f44e3-6c83-55bd-87e4-b1978d98bd5f";
path = "null";
replaceUrlInManifest = null;
treehash = "None";
artifacts = {};
src = null;
} {
name = "Configurations";
uuid = "5218b696-f38b-4ac9-8b61-a12ec717816d";
path = "C/Configurations";
replaceUrlInManifest = null;
treehash = "b8486a417456d2fbbe2af13e24cef459c9f42429";
artifacts = {};
src = fetchgit { url = "https://github.com/Roger-luo/Configurations.jl.git"; rev = "b8486a417456d2fbbe2af13e24cef459c9f42429"; sha256 = "1dz1h64nqgcv6ai70pfv2dv4mqx9rqmh08196k7j73bqlc6r00w1"; };
} {
name = "Crayons";
uuid = "a8cc5b0e-0ffa-5ad4-8c14-923d3ee1735f";
path = "C/Crayons";
replaceUrlInManifest = null;
treehash = "3f71217b538d7aaee0b69ab47d9b7724ca8afa0d";
artifacts = {};
src = fetchgit { url = "https://github.com/KristofferC/Crayons.jl.git"; rev = "3f71217b538d7aaee0b69ab47d9b7724ca8afa0d"; sha256 = "0v3zhjlnb2914bxcj4myl8pgb7m31p77aj2k1bckmqs96jdph10z"; };
} {
name = "DataAPI";
uuid = "9a962f9c-6df0-11e9-0e5d-c546b8b5ee8a";
path = "D/DataAPI";
replaceUrlInManifest = null;
treehash = "dfb3b7e89e395be1e25c2ad6d7690dc29cc53b1d";
artifacts = {};
src = fetchgit { url = "https://github.com/JuliaData/DataAPI.jl.git"; rev = "dfb3b7e89e395be1e25c2ad6d7690dc29cc53b1d"; sha256 = "14sfvkz169zcbap3gdwpj16qsap783h86fd07flfxk822abam11w"; };
} {
name = "DataValueInterfaces";
uuid = "e2d170a0-9d28-54be-80f0-106bbe20a464";
path = "D/DataValueInterfaces";
replaceUrlInManifest = null;
treehash = "bfc1187b79289637fa0ef6d4436ebdfe6905cbd6";
artifacts = {};
src = fetchgit { url = "https://github.com/queryverse/DataValueInterfaces.jl.git"; rev = "bfc1187b79289637fa0ef6d4436ebdfe6905cbd6"; sha256 = "0g2wj6q7jj956nx6g7dk8x7w1c4l2xcmnr1kq5x8s8fild9kslg8"; };
} {
name = "Dates";
uuid = "ade2ca70-3891-5945-98fb-dc099432e06a";
path = "null";
replaceUrlInManifest = null;
treehash = "None";
artifacts = {};
src = null;
} {
name = "Distributed";
uuid = "8ba89e20-285c-5b6f-9357-94700520ee1b";
path = "null";
replaceUrlInManifest = null;
treehash = "None";
artifacts = {};
src = null;
} {
name = "ExproniconLite";
uuid = "55351af7-c7e9-48d6-89ff-24e801d99491";
path = "E/ExproniconLite";
replaceUrlInManifest = null;
treehash = "c97ce5069033ac15093dc44222e3ecb0d3af8966";
artifacts = {};
src = fetchgit { url = "https://github.com/Roger-luo/ExproniconLite.jl.git"; rev = "c97ce5069033ac15093dc44222e3ecb0d3af8966"; sha256 = "0qk73k71c6v0vsq705mmxfj4vg3qslppxy5c8magaf7v5yr605iv"; };
} {
name = "FuzzyCompletions";
uuid = "fb4132e2-a121-4a70-b8a1-d5b831dcdcc2";
path = "F/FuzzyCompletions";
replaceUrlInManifest = null;
treehash = "9cde086faa37f32794be3d2df393ff064d43cd66";
artifacts = {};
src = fetchgit { url = "https://github.com/JunoLab/FuzzyCompletions.jl.git"; rev = "9cde086faa37f32794be3d2df393ff064d43cd66"; sha256 = "07sv88c472n6w4x7diy952igbcfm1s104ysnnvprld83312siw06"; };
} {
name = "HTTP";
uuid = "cd3eb016-35fb-5094-929b-558a96fad6f3";
path = "H/HTTP";
replaceUrlInManifest = null;
treehash = "b855bf8247d6e946c75bb30f593bfe7fe591058d";
artifacts = {};
src = fetchgit { url = "https://github.com/JuliaWeb/HTTP.jl.git"; rev = "b855bf8247d6e946c75bb30f593bfe7fe591058d"; sha256 = "10m7sqzm06c6pkn885gf6bjnbx4m8hcgy8lyzv15arlssrdracad"; };
} {
name = "IniFile";
uuid = "83e8ac13-25f8-5344-8a64-a9f2b223428f";
path = "I/IniFile";
replaceUrlInManifest = null;
treehash = "098e4d2c533924c921f9f9847274f2ad89e018b8";
artifacts = {};
src = fetchgit { url = "https://github.com/JuliaIO/IniFile.jl.git"; rev = "098e4d2c533924c921f9f9847274f2ad89e018b8"; sha256 = "19cn41w04hikrqdzlxhrgf21rfqhkvj9x1zvwh3yz9hqbf350xs9"; };
} {
name = "InteractiveUtils";
uuid = "b77e0a4c-d291-57a0-90e8-8db25a27a240";
path = "null";
replaceUrlInManifest = null;
treehash = "None";
artifacts = {};
src = null;
} {
name = "IteratorInterfaceExtensions";
uuid = "82899510-4779-5014-852e-03e436cf321d";
path = "I/IteratorInterfaceExtensions";
replaceUrlInManifest = null;
treehash = "a3f24677c21f5bbe9d2a714f95dcd58337fb2856";
artifacts = {};
src = fetchgit { url = "https://github.com/queryverse/IteratorInterfaceExtensions.jl.git"; rev = "a3f24677c21f5bbe9d2a714f95dcd58337fb2856"; sha256 = "1slpay1dhja8f9gy6z7b3psgvgcknn963dvfqqakvg1grk9ppa09"; };
} {
name = "JLLWrappers";
uuid = "692b3bcd-3c85-4b1f-b108-f13ce0eb3210";
path = "J/JLLWrappers";
replaceUrlInManifest = null;
treehash = "642a199af8b68253517b80bd3bfd17eb4e84df6e";
artifacts = {};
src = fetchgit { url = "https://github.com/JuliaPackaging/JLLWrappers.jl.git"; rev = "642a199af8b68253517b80bd3bfd17eb4e84df6e"; sha256 = "0v7xhsv9z16d657yp47vgc86ggc01i1wigqh3n0d7i1s84z7xa0h"; };
} {
name = "LibGit2";
uuid = "76f85450-5226-5b5a-8eaa-529ad045b433";
path = "null";
replaceUrlInManifest = null;
treehash = "None";
artifacts = {};
src = null;
} {
name = "Libdl";
uuid = "8f399da3-3557-5675-b5ff-fb832c97cbdb";
path = "null";
replaceUrlInManifest = null;
treehash = "None";
artifacts = {};
src = null;
} {
name = "LinearAlgebra";
uuid = "37e2e46d-f89d-539d-b4ee-838fcccc9c8e";
path = "null";
replaceUrlInManifest = null;
treehash = "None";
artifacts = {};
src = null;
} {
name = "Logging";
uuid = "56ddb016-857b-54e1-b83d-db4d58db5568";
path = "null";
replaceUrlInManifest = null;
treehash = "None";
artifacts = {};
src = null;
} {
name = "Markdown";
uuid = "d6f4376e-aef5-505a-96c1-9c027394607a";
path = "null";
replaceUrlInManifest = null;
treehash = "None";
artifacts = {};
src = null;
} {
name = "MbedTLS";
uuid = "739be429-bea8-5141-9913-cc70e7f3736d";
path = "M/MbedTLS";
replaceUrlInManifest = null;
treehash = "1c38e51c3d08ef2278062ebceade0e46cefc96fe";
artifacts = {};
src = fetchgit { url = "https://github.com/JuliaLang/MbedTLS.jl.git"; rev = "1c38e51c3d08ef2278062ebceade0e46cefc96fe"; sha256 = "0zjzf2r57l24n3k0gcqkvx3izwn5827iv9ak0lqix0aa5967wvfb"; };
} {
name = "MbedTLS_jll";
uuid = "c8ffd9c3-330d-5841-b78e-0817d7145fa1";
path = "M/MbedTLS_jll";
replaceUrlInManifest = null;
treehash = "0eef589dd1c26a3ac9d753fe1a8bcad63f956fa6";
artifacts = {
"519367e9365948074c1fcc9f4365597f147a5ab7" = [{
name = "MbedTLS";
url = "https://github.com/JuliaBinaryWrappers/MbedTLS_jll.jl/releases/download/MbedTLS-v2.16.8+0/MbedTLS.v2.16.8.x86_64-linux-gnu.tar.gz";
sha256 = "5968d98ac1d4fdbf44ed87b0687f916f69ab077961b3cc9ea6068e2d739bd953";
}];
};
src = fetchgit { url = "https://github.com/JuliaBinaryWrappers/MbedTLS_jll.jl.git"; rev = "0eef589dd1c26a3ac9d753fe1a8bcad63f956fa6"; sha256 = "0x43cp26p4w799i1cy4j72l5b1dyqcsab98qjw6yydxk2wha5vw4"; };
} {
name = "MsgPack";
uuid = "99f44e22-a591-53d1-9472-aa23ef4bd671";
path = "M/MsgPack";
replaceUrlInManifest = null;
treehash = "a8cbf066b54d793b9a48c5daa5d586cf2b5bd43d";
artifacts = {};
src = fetchgit { url = "https://github.com/JuliaIO/MsgPack.jl.git"; rev = "a8cbf066b54d793b9a48c5daa5d586cf2b5bd43d"; sha256 = "1layiqjf9si38pfdcszppgcy4zbfqgld7jlw8x645sm9b17b19fg"; };
} {
name = "NetworkOptions";
uuid = "ca575930-c2e3-43a9-ace4-1e988b2c1908";
path = "N/NetworkOptions";
replaceUrlInManifest = null;
treehash = "ed3157f48a05543cce9b241e1f2815f7e843d96e";
artifacts = {};
src = fetchgit { url = "https://github.com/JuliaLang/NetworkOptions.jl.git"; rev = "ed3157f48a05543cce9b241e1f2815f7e843d96e"; sha256 = "02nm4v67lb1dzhgyyr9bg8kylyyhsgfsxf3nxhbl4hp9dz48an0a"; };
} {
name = "OrderedCollections";
uuid = "bac558e1-5e72-5ebc-8fee-abe8a469f55d";
path = "O/OrderedCollections";
replaceUrlInManifest = null;
treehash = "85f8e6578bf1f9ee0d11e7bb1b1456435479d47c";
artifacts = {};
src = fetchgit { url = "https://github.com/JuliaCollections/OrderedCollections.jl.git"; rev = "85f8e6578bf1f9ee0d11e7bb1b1456435479d47c"; sha256 = "0jaxcmvkp8zpqrz101yikdigz90s70i7in5wn8kybwzf0na3lhwf"; };
} {
name = "Pkg";
uuid = "44cfe95a-1eb2-52ea-b672-e2afdf69b78f";
path = "null";
replaceUrlInManifest = null;
treehash = "None";
artifacts = {};
src = null;
} {
name = "Pluto";
uuid = "c3e4b0f8-55cb-11ea-2926-15256bba5781";
path = "P/Pluto";
replaceUrlInManifest = null;
treehash = "85156b21dee3a4515ff479555eb958ad33c057aa";
artifacts = {};
src = fetchgit { url = "https://github.com/fonsp/Pluto.jl.git"; rev = "85156b21dee3a4515ff479555eb958ad33c057aa"; sha256 = "1dvgrj0likniafs06hrwfndbshqr5khdqdyylganc1m81652rz5x"; };
} {
name = "Preferences";
uuid = "21216c6a-2e73-6563-6e65-726566657250";
path = "P/Preferences";
replaceUrlInManifest = null;
treehash = "00cfd92944ca9c760982747e9a1d0d5d86ab1e5a";
artifacts = {};
src = fetchgit { url = "https://github.com/JuliaPackaging/Preferences.jl.git"; rev = "00cfd92944ca9c760982747e9a1d0d5d86ab1e5a"; sha256 = "1cail43iqzbi6m9v6981rhz47zf2lcvhs5ds5gdqvc9nx5frghxq"; };
} {
name = "Printf";
uuid = "de0858da-6303-5e67-8744-51eddeeeb8d7";
path = "null";
replaceUrlInManifest = null;
treehash = "None";
artifacts = {};
src = null;
} {
name = "REPL";
uuid = "3fa0cd96-eef1-5676-8a61-b3b8758bbffb";
path = "null";
replaceUrlInManifest = null;
treehash = "None";
artifacts = {};
src = null;
} {
name = "Random";
uuid = "9a3f8284-a2c9-5f02-9a11-845980a1fd5c";
path = "null";
replaceUrlInManifest = null;
treehash = "None";
artifacts = {};
src = null;
} {
name = "SHA";
uuid = "ea8e919c-243c-51af-8825-aaa63cd721ce";
path = "S/SHA";
replaceUrlInManifest = null;
treehash = "None";
artifacts = {};
src = null;
} {
name = "Serialization";
uuid = "9e88b42a-f829-5b0c-bbe9-9e923198166b";
path = "null";
replaceUrlInManifest = null;
treehash = "None";
artifacts = {};
src = null;
} {
name = "Sockets";
uuid = "6462fe0b-24de-5631-8697-dd941f90decc";
path = "null";
replaceUrlInManifest = null;
treehash = "None";
artifacts = {};
src = null;
} {
name = "TOML";
uuid = "fa267f1f-6049-4f14-aa54-33bafae1ed76";
path = "T/TOML";
replaceUrlInManifest = null;
treehash = "44aaac2d2aec4a850302f9aa69127c74f0c3787e";
artifacts = {};
src = fetchgit { url = "https://github.com/JuliaLang/TOML.jl.git"; rev = "44aaac2d2aec4a850302f9aa69127c74f0c3787e"; sha256 = "1xy19fc5lrj9kh298xhczvmjl1cx3p46fj2xlmkisaqxzhd782yd"; };
} {
name = "TableIOInterface";
uuid = "d1efa939-5518-4425-949f-ab857e148477";
path = "T/TableIOInterface";
replaceUrlInManifest = null;
treehash = "9a0d3ab8afd14f33a35af7391491ff3104401a35";
artifacts = {};
src = fetchgit { url = "https://github.com/lungben/TableIOInterface.jl.git"; rev = "9a0d3ab8afd14f33a35af7391491ff3104401a35"; sha256 = "0p2fi9jbyfg2j6rysv4if7dx8qw2mssb04i75j1zq607j8707kvn"; };
} {
name = "TableTraits";
uuid = "3783bdb8-4a98-5b6b-af9a-565f29a5fe9c";
path = "T/TableTraits";
replaceUrlInManifest = null;
treehash = "c06b2f539df1c6efa794486abfb6ed2022561a39";
artifacts = {};
src = fetchgit { url = "https://github.com/queryverse/TableTraits.jl.git"; rev = "c06b2f539df1c6efa794486abfb6ed2022561a39"; sha256 = "08ssb2630wm6j8f2qa985mn2vfibfm5kjcn4ayl2qkhfcyp8daw4"; };
} {
name = "Tables";
uuid = "bd369af6-aec1-5ad0-b16a-f7cc5008161c";
path = "T/Tables";
replaceUrlInManifest = null;
treehash = "c9d2d262e9a327be1f35844df25fe4561d258dc9";
artifacts = {};
src = fetchgit { url = "https://github.com/JuliaData/Tables.jl.git"; rev = "c9d2d262e9a327be1f35844df25fe4561d258dc9"; sha256 = "1q0wh4031zdp40k44jaix19pzy6cnwsa2p0zfz6799jbyqkg4kr1"; };
} {
name = "Test";
uuid = "8dfed614-e22c-5e08-85e1-65c5234f0b40";
path = "null";
replaceUrlInManifest = null;
treehash = "None";
artifacts = {};
src = null;
} {
name = "URIs";
uuid = "5c2747f8-b7ea-4ff2-ba2e-563bfd36b1d4";
path = "U/URIs";
replaceUrlInManifest = null;
treehash = "97bbe755a53fe859669cd907f2d96aee8d2c1355";
artifacts = {};
src = fetchgit { url = "https://github.com/JuliaWeb/URIs.jl.git"; rev = "97bbe755a53fe859669cd907f2d96aee8d2c1355"; sha256 = "0kp4hg3kknkm2smlcizqfd33l9x4vkahc2714gnbjp39fj285b92"; };
} {
name = "UUIDs";
uuid = "cf7118a7-6976-5b1a-9a39-7adc72f591a4";
path = "null";
replaceUrlInManifest = null;
treehash = "None";
artifacts = {};
src = null;
} {
name = "Unicode";
uuid = "4ec0a83e-493e-50e2-b9ac-8f72acf5a8f5";
path = "null";
replaceUrlInManifest = null;
treehash = "None";
artifacts = {};
src = null;
}];
}

@@ -0,0 +1,15 @@
using Pkg
Pkg.activate(".")

using Pluto
Pluto.run(
    launch_browser = false,
    require_secret_for_access = false,
    require_secret_for_open_links = false,
    port = 9999,
    host = "127.0.0.1",
    notebook_path_suggestion = "/notebooks"
)

services/pluto.nix Normal file
@@ -0,0 +1,66 @@
{ config, lib, pkgs, ... }:
let sources = import ../nix/sources.nix;
in
{
  containers.pluto = {
    autoStart = true;
    bindMounts."/notebooks" = {
      hostPath = "/data/pluto";
      isReadOnly = false;
    };
    config = { pkgs, config, ... }: {
      systemd.services.pluto =
        let
          julia = (import ../pkgs/pluto)
            { pkgs = import sources.nixpkgs {}; };
          pluto = pkgs.stdenv.mkDerivation {
            name = "pluto-standalone";
            buildPhase = "mkdir $out";
            installPhase = ''
              cp *.toml $out
              cp *.jl $out
            '';
            src = ../pkgs/pluto;
          };
        in {
          enable = true;
          description = "Pluto.jl notebook server";
          wantedBy = [ "multi-user.target" ];
          serviceConfig = {
            Type = "simple";
            User = "pluto";
            Group = "pluto";
          };
          # julia needs a writable directory to keep its state in
          # (especially precompiled artifacts). The wrapped version of
          # julia below extends this depot path with a path from the
          # nix store that contains all needed packages, so this
          # should work entirely without internet access.
          environment.JULIA_DEPOT_PATH = "/var/lib/julia";
          script = ''
            cd ${pluto.outPath}
            ${julia}/bin/julia pluto-standalone.jl
          '';
        };
      users.users.pluto = {
        group = "pluto";
        home = "/notebooks";
        isSystemUser = true;
      };
      users.groups.pluto = {};
      systemd.tmpfiles.rules = [
        "d /var/lib/julia 0750 pluto pluto"
      ];
    };
  };
  systemd.services."container@pluto".serviceConfig = {
    MemoryHigh = "2G";  # will throttle, but not a hard limit
    MemoryMax = "2.5G"; # hard limit
    CPUQuota = "100%";  # CPU time roughly equivalent to one core
  };
}
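Since the server binds only to `127.0.0.1:9999` and, by design, has no usable built-in access control, it is reached exclusively through an SSH tunnel, as the commit message describes. A sketch of the access path (`hainich` is the deployment host named in the commit message):

```shell
# Forward local port 9999 to the notebook server on the host:
#   ssh hainich -L 9999:localhost:9999
# While that session stays open, the notebook is available at:
url="http://localhost:9999"
echo "$url"
```

Anyone with SSH access to the host can reach the notebook this way; anyone without it cannot, which is the whole of the access-control story here.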