Compare commits

..

4 Commits

Author SHA1 Message Date
xenia f5b82de4f1 update lix 2024-11-13 01:09:51 -05:00
xenia 2451f14ba7 update upstream 2024-11-13 00:48:28 -05:00
xenia 95738f98e7 fix non-flake default.nix usage with extra args 2024-11-13 00:47:30 -05:00
xenia 6dae849266 pin nixpkgs 24.05 2024-10-21 21:44:33 -04:00
128 changed files with 2606 additions and 6417 deletions

README.md

@@ -6,139 +6,165 @@ include_toc: true
# dragnpkgs
this is my personal nixos modules and packages repository. while it was designed for my own use,
it's also intended to be flexible and reusable enough for general purpose usage. i might consider
upstreaming into nixpkgs if there is sufficient interest
dragnpkgs provides the following
- a set of package definitions, in `pkgs/`, which provide packages not in `nixpkgs`, some of my own
libraries and utilities, and rewrites/patches of upstream packages to suit my needs
- the top level overlay is located in `overlay.nix`, in a similar style as nixpkgs
`all-packages.nix`
- a set of nixos modules, in `modules/`
- a module including all of the other modules is located at `module.nix`
- utilities, in `lib/` and contained within `flake.nix`
- flake templates, in `templates/`
- a full wrapper around `nixpkgs` which includes the package set and nixos modules by default, and
changes the default nix implementation to `lix`, so this repo can be used in place of the
`nixpkgs` flake
## licensing
this repository is NOT licensed under a "standard" FOSS license. instead, it uses [CC-BY-NC-SA
4.0](https://creativecommons.org/licenses/by-nc-sa/4.0/deed.en). this means, in particular that
commercial use is forbidden. if you are, for whatever reason, interested in using this code
commercially, please contact me
additionally, several package definitions included in this repo point to packages which have their
own noteworthy licensing (including, for example, unfree and non-redistributable game server
software). make sure you are following the license requirements, which can be found in
`meta.license` for each package
## usage
since i use flakes now (sigh!!!) i'm not supporting non-flake usage anymore. if you read the files
in the repo there's a way to do it probably
for flake usage, add this repo as an input and don't input nixpkgs at all, since we fully wrap it
dragnpkgs provides a set of nixos modules and a nixpkgs overlay containing custom packages. the
modules require the overlay
### non-flake
```nix
{config, lib, pkgs, ...}:
{
imports = [
/path/to/dragnpkgs/module.nix
];
nixpkgs.overlays = [ (import /path/to/dragnpkgs/overlay.nix) ];
}
```
for standalone nix on other distros, use `~/.config/nixpkgs/overlays.nix` to enable the dragnpkgs
overlay
```nix
[ (import <dragnpkgs/overlay.nix>) ]
```
### flake
for flake usage, point your `nixpkgs` to this repo
```nix
{
  inputs = {
    # for nixos-24.05
    nixpkgs.url = "git+https://git.lain.faith/haskal/dragnpkgs.git?ref=nixos-24.05";
    # for nixos-unstable
    nixpkgs.url = "git+https://git.lain.faith/haskal/dragnpkgs.git?ref=main";
  };
}
```
```nix
{
  inputs = {
    # for nixos-25.05
    dragnpkgs.url = "git+https://git.lain.faith/haskal/dragnpkgs.git?ref=nixos-25.05";
    # for nixos-unstable
    dragnpkgs.url = "git+https://git.lain.faith/haskal/dragnpkgs.git?ref=main";
  };
  outputs = { self, dragnpkgs, ... }: {
    nixosConfigurations.mycomputer = dragnpkgs.lib.nixosSystem {
      ...
    };
  };
}
```
note that the dragnpkgs module sets a couple defaults -- see module.nix and the inline modules in
flake.nix for details
- disables nixpkgs self-registration in the flake registry and nix path and enables a
dragnpkgs-specific registration mechanism for these that is enabled by default, see
`options.dragnpkgs`
- in flake.nix but not in module.nix: disable channels
- enable experimental features `nix-command flakes repl-flake`
- disable the default flake registry. i think it's cringe
- add a repl overlay that adds some useful utilities to `nix repl` -- see repl-overlay.nix for
details
- provides a flake pure eval mode bypass via a lix plugin for allowlisting certain unfree licenses
that can be enabled when the user has permission to use packages with those licenses. this allows
usage of those packages without needing to set `NIXPKGS_ALLOW_UNFREE=1` and passing `--impure`,
which i find very clunky
also note that overriding inputs to the flake won't necessarily work because of the way nixpkgs
registers itself with the system. this requires really annoying hacks to get working at all. if you
want to depend on `dragnpkgs` with a different version of `nixpkgs` (ie not 24.11 or unstable),
clone the repo and recreate `flake.lock`. aren't flakes so cool and fun!!!!
## flake lib documentation
These utilities are provided by the dragnpkgs flake.
### `dragnpkgs.lib.mkFlake attrs`
This provides a small utility for defining flakes in a way that avoids some of the pain related to
flake attributes being keyed by `system`. `attrs` is an attribute set similar to what would normally
be returned for `outputs`, but the keys `packages`, `legacyPackages`, `devShells`, and `apps` are
written in `callPackage` style
For example:
```nix
outputs = { self, dragnpkgs }: dragnpkgs.lib.mkFlake {
  devShells.default = {
    mkShell,
    hello,
  }: mkShell {
    packages = [
      hello
    ];
  };
};
```
Currently there is no mechanism to access `system`-keyed attributes from another `system`-keyed
attribute, so it must be done manually using `system` in the arguments to the `callPackage`-style
function. For example:
```nix
outputs = { self, dragnpkgs }: dragnpkgs.lib.mkFlake {
  packages.default = {
    stdenv,
    mydependency,
  }: stdenv.mkDerivation {
    pname = "mypackage";
    version = "DEV";
    src = ./.;
    buildInputs = [ mydependency ];
  };
  devShells.default = {
    mkShell,
    system,
  }: mkShell {
    packages = [
      self.packages.${system}.default
    ];
  };
};
```
Future work is planned to make this easier.
## options documentation
documentation for options provided by dragnpkgs
### [`services.ghidra-server`](./modules/ghidra-server)
the shared project server for [ghidra](https://ghidra-sre.org)
example usage:
```nix
services.ghidra-server = {
  enable = true;
  host = "your.domain.or.ip";
};
```
#### services.ghidra-server.enable
enables the ghidra server service
#### services.ghidra-server.enableAdminCli
adds a system package for the CLI tool `ghidra-svrAdmin`, which allows anyone in the `ghidra` group
to administer the server (this corresponds to the `server/svrAdmin` tool in the stock ghidra
distribution)
#### services.ghidra-server.{package, jdkPackage} (`ghidra_headless`, `openjdk17_headless`)
allows overriding the ghidra package and jdk package used for the server
#### services.ghidra-server.host
the server hostname or IP; this is typically required (by java RMI) for correct operation
#### services.ghidra-server.basePort (`13100`)
the server will use 3 consecutive TCP ports starting from this port
#### services.ghidra-server.directory (`ghidra-server`)
the root directory for server files, as a subdirectory of `/var/lib`. this is needed because this
option is passed to systemd `StateDirectory=`
#### services.ghidra-server.{user,group} (`ghidra`)
the service user and group
### more coming soon(tm)
## packages documentation
### [`ghidra_headless`](./default.nix)
a variant of ghidra built with a headless openjdk, intended to reduce closure size for server
operation
### [`ghidra`](./pkgs/ghidra-xenia/build.nix)
preview version of ghidra with my nix patches
### [`kicad`](./pkgs/kicad-xenia/default.nix)
preview version of kicad with my patches
### [`ocamlPackages.ppx_unicode`](./pkgs/ocaml/ppx_unicode)
opinionated ppx for string literals: <https://git.lain.faith/haskal/ppx_unicode>
### [`ocamlPackages.xlog`](./pkgs/ocaml/xlog)
logging for cats, in ocaml: <https://git.lain.faith/haskal/xlog>
### [`python312Packages.feedvalidator` or `feedvalidator`](./pkgs/python/feedvalidator)
the W3C atom/RSS feed validator library, <https://github.com/w3c/feedvalidator>
this package comes with an additional CLI bin, `feedvalidator`, which is a simple wrapper around the
library that enables CLI usage
usage
```
usage: feedvalidator [-h] [-b BASE] file
W3C feedvalidator
positional arguments:
file File to validate
options:
-h, --help show this help message and exit
-b BASE, --base BASE Base URL of document
```
example
```bash
feedvalidator --base "https://my-base-url/atom.xml" path/to/atom.xml
```
### [`outer-wilds-text-adventure`](./pkgs/games/outer-wilds-text-adventure)
nix packaging for the Outer Wilds text adventure game. it should work by default on NixOS. if using
the nix package manager on a non-NixOS computer, you also need the following when using pipewire or
another ALSA plugin that lives in a separate package
```bash
export ALSA_PLUGIN_DIR=$(nix eval -f '<nixpkgs>' --raw pipewire)/lib/alsa-lib
```
## lib documentation
These utilities are provided by the dragnpkgs overlay
### [`fetchFromSteam`](./lib/fetchsteam)
a fetcher that downloads binaries from [Steam](https://store.steampowered.com/) using
@@ -168,8 +194,7 @@ pkgs.fetchFromSteam {
### [`fetchb4`](./lib/fetchb4)
A fetcher that uses `b4` to download patchsets from <https://lore.kernel.org> so that they can be
applied in `boot.kernelPatches`
Usage:
@@ -184,8 +209,7 @@
}
```
note that not specifying a version may cause future invocations to return different output if a
newer version is sent to the thread
### [`mkNginxServer`](./lib/dev-nginx)
@@ -204,6 +228,24 @@ pkgs.mkNginxServer {
}
```
### [`gitSource`](./lib/git-source)
for development package nix files, computes the source set of files tracked by git at the given root
path
arguments:
- `root`: the root of the git repo, where `.git` is located
- `subdir`, optional: a subdirectory within the git repo. if provided, only files in this
subdirectory will go into the final source set
example:
```nix
stdenv.mkDerivation {
# ...
src = gitSource { root = ./.; };
}
```
### [`makeSquashFs`](./lib/make-squashfs)
builds a squashfs image from the given derivations
@@ -221,429 +263,20 @@ makeSquashFs {
create a packaged nix distribution with the given packages in it for weird HPC systems. go read the
source to find out what it does; i don't recommend using this if you're not me
### [`instrumentedFetch`](./overlay.nix)
overrides the given fetch derivation (eg `fetchzip` or `fetchgit`) and logs the hash of the result.
this enables automatically determining and filling in the hash value when initially developing the
nix expression for a package. the log will contain text in the format
`FETCH_HASH:<hash>:FETCH_HASH`.
### [`lib.licenses.fyptl`](./lib/licenses/fyptl.nix)
The "Fuck You, Pirate This License" (FYPTL) is the author's version of a software non-license, which
explicitly does not grant any rights to use, modify, or redistribute a given piece of software, but
does disclaim warranty.
## development
structure of this repo
- `default.nix`: the top level NixOS module, which can also be interpreted as a plain nix file
outside of NixOS for access to just the nixpkgs overlay. this contains all definitions for
packages, library functions, and NixOS modules
- `lib/`: library functions (ie functions that get added to the overlay) go here
- `modules/`: NixOS modules go here
- `pkgs/`: packages that get added to the overlay go here
- `support/`: WIP support tools (eg generating documentation)
## licensing
this repository is NOT licensed under a "standard" FOSS license. instead, it uses
[CC-BY-NC-SA 4.0](https://creativecommons.org/licenses/by-nc-sa/4.0/deed.en). this means, in
particular that commercial use is forbidden. if you are, for whatever reason, interested in using
this code commercially, please contact me
## nixos options documentation
documentation for nixos options provided by dragnpkgs
### [`dragnpkgs`](./flake.nix)
options for configuring dragnpkgs
### [`dragnpkgs.setFlakeRegistry`](./flake.nix) (`true`)
Set flake registry option pointing to self
### [`dragnpkgs.setNixPath`](./flake.nix) (`true`)
Set nix path entry pointing to self
### [`dragnpkgs.setNixpkgsFlakeAlias`](./flake.nix) (`true`)
Set flake registry entry for `nixpkgs` to self
### [`dragnpkgs.setTemplatesFlakeAlias`](./flake.nix) (`true`)
Set flake registry entry for `templates` to self
### [`dragnpkgs.possiblyCommitCrimes`](./flake.nix) (`false`)
Globally enable usage of packages marked as FYPTL. This installs a nix plugin, which is widely
considered to be a nix crime, and it also might be an actual crime to use these packages depending
on your jurisdiction. Use at your own risk
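as a quick illustration, a minimal sketch of toggling these options in a nixos configuration built
with `dragnpkgs.lib.nixosSystem` (the values shown are just the documented defaults):
```nix
{
  # flake registry / nix path entries pointing at dragnpkgs (these are the defaults)
  dragnpkgs.setFlakeRegistry = true;
  dragnpkgs.setNixPath = true;
  dragnpkgs.setNixpkgsFlakeAlias = true;
  dragnpkgs.setTemplatesFlakeAlias = true;
  # off by default; see the warning above before enabling
  dragnpkgs.possiblyCommitCrimes = false;
}
```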
### [`services.ghidra-server`](./modules/ghidra-server)
the shared project server for [ghidra](https://github.com/NationalSecurityAgency/ghidra)
example usage:
```nix
services.ghidra-server = {
enable = true;
host = "your.domain.or.ip";
};
```
##### development notes
the module does the following:
- sets up unix permissions on the ghidra repositories location that allows anyone in the `ghidra`
group to run `ghidra-svrAdmin` to perform admin tasks
- only supports basic username/password authentication for the time being
- parses the classpath file for the ghidra server which is normally read by the launcher, and uses
it to launch the server directly, without using the launcher. this was done because the launcher
was doing several things that were unwanted / could be better handled by systemd and journald, and
it was complicated to turn them off. this also allows us to customize the jvm args more easily
- provides a log4j configuration that causes all logs to be sent to the system journal. this
effectively disables any ghidra-server-specific logfile management
- sets the most basic isolation parameters (`PrivateTmp=true` and `NoNewPrivileges=true`), but more
work could be done to secure the ghidra server service
#### `services.ghidra-server.enable`
enables the ghidra server service
#### `services.ghidra-server.enableAdminCli` (`true`)
adds a system package for the CLI tool `ghidra-svrAdmin`, which allows anyone in the `ghidra` group
to administer the server (this corresponds to the `server/svrAdmin` tool in the stock ghidra
distribution)
#### `services.ghidra-server.{package, jdkPackage}` (`ghidra_headless`, `openjdk21_headless`)
allows overriding the ghidra package and jdk package used for the server
#### `services.ghidra-server.host`
the server hostname or IP; this is typically required (by java RMI) for correct operation
#### `services.ghidra-server.basePort` (`13100`)
the server will use 3 consecutive TCP ports starting from this port
#### `services.ghidra-server.directory` (`ghidra-server`)
the root directory for server files, as a subdirectory of `/var/lib`. this is needed because this
option is passed to systemd `StateDirectory=`
#### `services.ghidra-server.{user,group}` (`ghidra`)
the service user and group
### [`programs.ghidra`](./modules/ghidra-client/default.nix)
like upstream, but patches an issue with loading python packages in the ghidra debug feature
additionally, provides a way to specify extensions
#### `programs.ghidra.extensions`
Ghidra extensions to be included in the installation.
example:
```
[ (ps: with ps; [ binsync ]) ]
```
#### `programs.ghidra.binsync.enable`
enable binsync integration
### [`programs.idapro`](./modules/idapro/default.nix)
Enables IDA Pro in the system environment, with optional plugin config.
This also directs IDA Pro to use `~/.config/idapro` as its main user config directory, instead of
`~/.idapro`. Unfortunately, as of IDA Pro 9.2, `~/.idapro` still gets created, though it will be
empty.
#### `programs.idapro.enable`
Whether to enable IDA Pro
#### `programs.idapro.package`
The IDA Pro package to use
#### `programs.idapro.binsync.enable`
Enables binsync integration with IDA Pro
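a minimal sketch of enabling the module with binsync integration (option names as documented above):
```nix
{
  programs.idapro.enable = true;
  programs.idapro.binsync.enable = true;
}
```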
### [`environment.machineInfo`](./modules/machine-info/default.nix)
provides options to customize the `/etc/machine-info` file on a NixOS system. See the module itself
and <https://www.freedesktop.org/software/systemd/man/latest/machine-info.html> for more info
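a small illustrative sketch (the values are taken from the module's own option examples and are
placeholders):
```nix
{
  environment.machineInfo = {
    prettyHostname = "Jade's Laptop 💎";
    chassis = "laptop";
    location = "Bedroom";
  };
}
```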
### [`services.satisfactory`](./modules/satisfactory-dedicated-server/default.nix)
The dedicated server for the game [Satisfactory](https://satisfactorygame.com)
This module provides the needed runtime environment for the dedicated server to run on NixOS, as
well as settings which can be automatically applied to provision the server on the first start (eg
server name, admin password). This provisioning needs to be done at runtime, due to the way the
server works, but it will be performed securely, before the server is exposed to the network for the
first time. This means you can safely deploy the server to the public internet without worrying
about exposing the "unclaimed" initial server mode, where any user could gain full privileges.
##### development notes
this module does the following:
- sets up `satisfactory.service` with some systemd isolation options and
notably, a private mount namespace in which the nix store path for the server
is mounted together with some overmounts for read-write directories within
the installation. this allows the software to "write to its own install
directory" which is required for operation. the real location of the written
files is in `/var/lib/satisfactory`
- if certs are provided, the systemd credentials mechanism is used to make them
available to the server process. another bind overmount is used to put the
credentials dir in the place that the server binary expects. additionally,
`satisfactory-restart-certs.service` is configured to restart the dedicated
server whenever the cert is renewed
- when the first-run options are specified,
`satisfactory-first-time-setup.service` is configured as a dependency with a
condition on the data file the server uses to store its settings. if the file
exists, the first-run setup is skipped. in this service,
`PrivateNetwork=true` is used to isolate the service from the network while a
bash script executes HTTP API calls to perform the requested setup. once this
is done, the server is shut down and execution will proceed to the main
`satisfactory.service`
this is mostly still in line with [a blog post i wrote on the
topic](https://blog.awoo.systems/posts/2024-01-12-going-win32-scale-packaging-the-satisfactory-dedicated-server-on-nixos)
but there have been some changes since then that are not reflected in the post
#### `services.satisfactory.enable`
enables the satisfactory dedicated server service
#### `services.satisfactory.package` (`pkgs.satisfactory-dedicated-server`)
the package to use for the service
#### `services.satisfactory.directory` (`"/var/lib/satisfactory"`)
Directory where Satisfactory Dedicated Server data will be stored
#### `services.satisfactory.{user,group}` (`"satisfactory"`)
User account and group under which Satisfactory Dedicated Server runs
#### `services.satisfactory.useACMEHost` (`null`)
If set, the server will use the ACME-provided TLS certificate for the given host.
Note that this module does not actually provision the specified certificate; you must use additional
config (e.g., `services.nginx.virtualHosts.<name>.enableACME = true`) to provision the certificate
using a supported ACME method.
#### `services.satisfactory.port` (`7777`)
Server port number (TCP/UDP)
This corresponds to the `-Port` command line option.
#### `services.satisfactory.reliablePort` (`8888`)
Server reliable port number
This corresponds to the `-ReliablePort` command line option.
#### `services.satisfactory.externalReliablePort` (`null`)
Server reliable port number as seen outside NAT.
This corresponds to the `-ExternalReliablePort` command line option.
#### `services.satisfactory.disableSeasonalEvents` (`false`)
Whether to run the server with seasonal events disabled.
This corresponds to the `-DisableSeasonalEvents` command line option.
#### `services.satisfactory.extraIniOptions` (`{}`)
Run the server with additional ini configuration values.
This is a nested attribute set of values.
- The top level attribute specifies the ini file containing the value to set (i.e., the
first component of the `-ini` command line option), for example `Game` or `Engine`.
- The secondary level attribute specifies the ini file category, without brackets,
for example `/Script/Engine.GameSession`.
- The final level attribute specifies the option name to set, for example
`MaxPlayers`. The value of the attribute is the value to set on the command line.
This corresponds to the `-ini` command line option.
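for example, the nesting maps onto `-ini` roughly like this (a sketch; the `MaxPlayers` value is the
module's own example):
```nix
{
  # becomes: -ini:Game:[/Script/Engine.GameSession]:MaxPlayers=8
  services.satisfactory.extraIniOptions = {
    Game."/Script/Engine.GameSession".MaxPlayers = "8";
  };
}
```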
#### `services.satisfactory.initialSettings`
Settings to apply to the server via the server API on the first run.
#### `services.satisfactory.initialSettings.serverName` (`null`)
The name of the server.
If this is provided, `adminPasswordFile` must also be set.
#### `services.satisfactory.initialSettings.adminPasswordFile` (`null`)
Path to a file containing the initial admin password.
If this is provided, `serverName` must also be set.
#### `services.satisfactory.initialSettings.clientPasswordFile` (`null`)
Path to a file containing the initial client password. If not set, the server will
not be configured with a client password and will be accessible to any client.
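putting the first-run options together, a minimal sketch of a provisioned server (the name and the
secret path are placeholders):
```nix
{
  services.satisfactory = {
    enable = true;
    initialSettings = {
      serverName = "My Dedicated Server";
      adminPasswordFile = "/var/lib/secrets/admin-password.txt";
    };
  };
}
```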
### [`hardware.wirelessRegulatoryDomain`](./modules/regdom/default.nix)
The wireless regulatory domain to set in the kernel `cfg80211` module. This defaults to `"00"`
(international), but more bands (such as 6GHz, on supported hardware) can be enabled by setting this
to the jurisdiction in which the machine is located, for example `"US"`.
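for example, to enable the US bands:
```nix
{
  hardware.wirelessRegulatoryDomain = "US";
}
```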
## packages documentation
### [`ghidra`](./pkgs/reverse-engineering/ghidra/build.nix)
a version of ghidra that uses a split derivation, `lib` contains the core ghidra distribution, `doc`
contains all the documentation elements, and `out` contains the bin folder, icons, and desktop file.
only `out` has a dependency on the build jdk, so `lib` and `doc` can be used with reduced closure
size
### [`ghidra_headless`](./pkgs/reverse-engineering/ghidra/build.nix)
a variant of ghidra which does not have a dependency on any jdk, intended to reduce closure size for
server operation with a headless jdk (in particular, the ghidra-server nixos module uses
`ghidra_headless` with `openjdk21_headless` by default)
this is equivalent to the `lib` output of the split `ghidra` package
### [`ghidra-extensions`](./pkgs/reverse-engineering/ghidra/extensions)
like upstream, but contains additional extensions:
- `binsync-ghidra`: the binsync `ghidra_scripts` installation packaged as an extension, so it can be
installed at the system level
### [`ocamlPackages.ppx_unicode`](./pkgs/ocaml/ppx_unicode)
opinionated ppx for string literals: <https://git.lain.faith/haskal/ppx_unicode>
### [`ocamlPackages.xlog`](./pkgs/ocaml/xlog)
logging for cats, in ocaml: <https://git.lain.faith/haskal/xlog>
### [`ocamlPackages.systemd-ml`](./pkgs/ocaml/systemd-ml)
libsystemd implementation in native ocaml: <https://git.lain.faith/haskal/systemd-ml>
### [`ocamlPackages.ocaml-manual`](./pkgs/ocaml/ocaml-manual)
the ocaml html docs package from opam
### [`ocamlPackages.patdiff-bin`](./pkgs/ocaml/patdiff-bin)
a repackaged version of `ocamlPackages.patdiff` with a reduced closure size
### [`python312Packages.feedvalidator` or `feedvalidator`](./pkgs/python/feedvalidator)
the W3C atom/RSS feed validator library, <https://github.com/w3c/feedvalidator>
this package comes with an additional CLI bin, `feedvalidator`, which is a simple wrapper around the
library that enables CLI usage
usage
```
usage: feedvalidator [-h] [-b BASE] file
W3C feedvalidator
positional arguments:
file File to validate
options:
-h, --help show this help message and exit
-b BASE, --base BASE Base URL of document
```
example
```bash
feedvalidator --base "https://my-base-url/atom.xml" path/to/atom.xml
```
### [`python312Packages.megacom` or `megacom`](./pkgs/python/megacom)
a python utility to access serial ports from the command line
### [`python311Packages.binsync` and `python311Packages.libbs`](./pkgs/reverse-engineering/binsync)
packaged latest versions of binsync and libbs from git
### [`outer-wilds-text-adventure`](./pkgs/games/outer-wilds-text-adventure)
nix packaging for the Outer Wilds text adventure game. it should work by default on NixOS. if using
the nix package manager on a non-NixOS computer, you also need the following when using pipewire or
another ALSA plugin that lives in a separate package
```bash
export ALSA_PLUGIN_DIR=$(nix eval -f '<nixpkgs>' --raw pipewire)/lib/alsa-lib
```
### [`racket`, `racket-minimal`, `racketPackages`](./pkgs/racket)
dragnpkgs contains a slightly customized version of racket and racket-minimal (to include some minor
bugfixes that are pending upstream stable release).
additionally, a new scope `racketPackages` provides some packages from
<https://pkgs.racket-lang.org>, automatically converted from their catalog information and
`info.rkt` by [racket2nix](https://git.lain.faith/haskal/racket-nix). see the readme on that repo
for information on how to use `buildRacketPackage` and `makeRacketEnv`
### [`satisfactory-dedicated-server`](./pkgs/games/satisfactory-dedicated-server)
The dedicated server for [Satisfactory](https://satisfactorygame.com), with packaging steps to make
it run correctly on NixOS. This must be used together with the NixOS module
(`services.satisfactory`), which sets up the environment needed for the server to execute.
See
[`services.satisfactory`](#services-satisfactory-modules-satisfactory-dedicated-server-default-nix)
for further info and development notes
### [`eta`](./pkgs/cmdline/eta)
Generic tool for monitoring ETA and progress of an arbitrary process.
<https://github.com/aioobe/eta>
### [`zbasefind`](./pkgs/rust/zbasefind)
Command line tool to guess the base address of a raw firmware binary (zoomer edition).
<https://git.lain.faith/haskal/zbasefind.git>
### [`cado-nfs`](./pkgs/crypto/cado-nfs)
Cado-NFS, An Implementation of the Number Field Sieve Algorithm
<https://gitlab.inria.fr/cado-nfs/cado-nfs>
### [`lix-plugins`](./pkgs/lix/lix-plugins)
A plugin module for lix which provides the flake pure eval bypass which can be enabled using the
dragnpkgs flake.
### [`zfs_2_3`](./pkgs/zfs/)
A version of ZFS with a patch for the `zed` userspace daemon to enable desktop notifications on ZFS
errors. This makes ZFS a bit more reasonable to run on GUI systems
### [`pympress`](./pkgs/python/pympress)
A version of [pympress](https://cimbali.github.io/pympress/) with a patch to fix the window icon on
KDE
### [`texliveDragonPackages.moloch`](./pkgs/tex/moloch)
A version of the [moloch](https://jolars.co/blog/2024-05-30-moloch/) beamer theme with [some
patches](https://git.lain.faith/haskal/moloch-dragon/) to make it easier to use with pympress and
fix an issue with appendix slide numbering
### [`idapro`](./pkgs/reverse-engineering/idapro9/default.nix)
Nix packaging for IDA Pro (see the file for details on how to use it)

TODO.md

@@ -1,12 +1,13 @@
# TODO
## `ghidra`
- split ghidra-lib package with no dependency on any jdk (launch.sh)
- make a custom launch script similar to ghidra-server nixos module to replace launch.sh
- make runtime jdk configurable and independent of build jdk
- adding a doc split output with ghidra javadoc/sleigh doc/etc would be nice
- wrap/expose pyghidraRun so it works correctly
- export passthru python packages
- pyghidra packages
- type stubs
- lower priority: gdb, lldb
## ghidra-xenia-v2
## upstream
- fix kicad desktop file name
## `ghidra-server`
@@ -15,3 +16,7 @@ create NixOS VM test
- test that ghidra-svrAdmin works as an unprivileged user in the `ghidra` group
- possibly test remotely importing a binary. however, ghidra-svrAdmin working is a good indicator of
the server being functional
## general
- meta info / license info for pkgs


@@ -14,9 +14,9 @@ let
  };
in
{ overlays ? [], ... } @ args:
import "${nixpkgs}" ({
import "${nixpkgs}" {
  overlays = [
    (import ./overlay.nix)
    (import "${lix-module}/overlay.nix" { inherit lix; })
  ] ++ overlays;
} // (builtins.removeAttrs args [ "overlays" ]))
} // (builtins.removeAttrs args [ "overlays" ])
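for context, a sketch of how this non-flake entry point might be invoked with extra arguments
(assuming the file shown in this hunk is the repo's `default.nix`; the path and overlay are
placeholders):
```nix
let
  pkgs = import /path/to/dragnpkgs {
    system = "x86_64-linux";
    # extra overlays are appended after the dragnpkgs and lix overlays
    overlays = [ (final: prev: { }) ];
  };
in
  pkgs.hello
```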


@@ -1,37 +1,52 @@
{
  "nodes": {
    "lix-module": {
    "lix": {
      "flake": false,
      "locked": {
        "lastModified": 1756125859,
        "lastModified": 1729298361,
        "narHash": "sha256-6a+PWILmqHCs9B5eIBLg6HSZ8jYweZpgOWO8FlyVwYI=",
        "narHash": "sha256-hiGtfzxFkDc9TSYsb96Whg0vnqBVV7CUxyscZNhed0U=",
        "rev": "d3292125035b04df00d01549a26e948631fabe1e",
        "rev": "ad9d06f7838a25beec425ff406fe68721fef73be",
        "type": "tarball",
        "url": "https://git.lix.systems/api/v1/repos/lix-project/nixos-module/archive/d3292125035b04df00d01549a26e948631fabe1e.tar.gz?rev=d3292125035b04df00d01549a26e948631fabe1e"
        "url": "https://git.lix.systems/api/v1/repos/lix-project/lix/archive/ad9d06f7838a25beec425ff406fe68721fef73be.tar.gz?rev=ad9d06f7838a25beec425ff406fe68721fef73be"
      },
      "original": {
        "type": "tarball",
        "url": "https://git.lix.systems/lix-project/nixos-module/archive/2.93.3-2.tar.gz"
        "url": "https://git.lix.systems/lix-project/lix/archive/2.91.1.tar.gz"
      }
    },
    "lix-module": {
      "flake": false,
      "locked": {
        "lastModified": 1729360442,
        "narHash": "sha256-6U0CyPycIBc04hbYy2hBINnVso58n/ZyywY2BD3hu+s=",
        "rev": "9098ac95768f7006d7e070b88bae76939f6034e6",
        "type": "tarball",
        "url": "https://git.lix.systems/api/v1/repos/lix-project/nixos-module/archive/9098ac95768f7006d7e070b88bae76939f6034e6.tar.gz?rev=9098ac95768f7006d7e070b88bae76939f6034e6"
      },
      "original": {
        "type": "tarball",
        "url": "https://git.lix.systems/lix-project/nixos-module/archive/2.91.1-1.tar.gz"
      }
    },
    "nixpkgs": {
      "locked": {
        "lastModified": 1759831965,
        "lastModified": 1731239293,
        "narHash": "sha256-vgPm2xjOmKdZ0xKA6yLXPJpjOtQPHfaZDRtH+47XEBo=",
        "narHash": "sha256-q2yjIWFFcTzp5REWQUOU9L6kHdCDmFDpqeix86SOvDc=",
        "owner": "NixOS",
        "repo": "nixpkgs",
        "rev": "c9b6fb798541223bbb396d287d16f43520250518",
        "rev": "9256f7c71a195ebe7a218043d9f93390d49e6884",
        "type": "github"
      },
      "original": {
        "owner": "NixOS",
        "ref": "nixos-unstable",
        "ref": "nixos-24.05",
        "repo": "nixpkgs",
        "type": "github"
      }
    },
    "root": {
      "inputs": {
        "lix": "lix",
        "lix-module": "lix-module",
        "nixpkgs": "nixpkgs"
      }

flake.nix

@@ -2,81 +2,27 @@
description = "dragnpkgs together with nixpkgs and lix";
inputs = {
nixpkgs.url = "github:NixOS/nixpkgs/nixos-unstable";
nixpkgs.url = "github:NixOS/nixpkgs/nixos-24.05";
lix-module = {
url = "https://git.lix.systems/lix-project/nixos-module/archive/2.93.3-2.tar.gz";
url = "https://git.lix.systems/lix-project/nixos-module/archive/2.91.1-1.tar.gz";
flake = false;
};
lix = {
url = "https://git.lix.systems/lix-project/lix/archive/2.91.1.tar.gz";
flake = false;
};
};
outputs = { self, nixpkgs, lix-module }:
outputs = { self, nixpkgs, lix, lix-module }:
let
overlays = [
(import ./overlay.nix)
(import "${lix-module}/overlay.nix" { lix = null; })
(import "${lix-module}/overlay.nix" { inherit lix; })
];
forAllSystems = nixpkgs.lib.genAttrs nixpkgs.lib.systems.flakeExposed;
libVersionInfoOverlay = import "${nixpkgs}/lib/flake-version-info.nix" nixpkgs;
# this is taken from upstream. if upstream changes, the code here needs to be updated to match
addLibVersionInfo = lib: lib.extend libVersionInfoOverlay;
lib-base = addLibVersionInfo (import "${nixpkgs}/lib");
in {
lib = nixpkgs.lib.extend (final: prev: {
# we don't just use nix.registry.whatever.flake = self
# the reason for this is to be able to handle a flake.lock containing an entry for this
# flake -- setting .flake makes it a path entry in the registry, whereas we want our
# self reference in the registry to be downloadable by URL in case it makes it into a
# flake.lock
meta.registry-entry = {
from = { id = "dragnpkgs-unstable"; type = "indirect"; };
to = {
type = "git";
url = "https://git.lain.faith/haskal/dragnpkgs.git";
ref = "main";
} // self.lib.filterAttrs
(n: _: n == "lastModified" || n == "rev" || n == "revCount" || n == "narHash")
self;
};
# the nix path entry for self
meta.path-entry = "dragnpkgs-unstable=flake:dragnpkgs-unstable";
lib = (lib-base.extend (import ./lib/overlay.nix)).extend (final: prev: {
# initializes regular upstream nixpkgs with the given arguments
nixpkgs-custom = { system, ... } @ args: (
(import "${nixpkgs}" args).extend (final: prev: {
lib = addLibVersionInfo prev.lib;
})
);
# initializes dragnpkgs with its overlays and default config using the given arguments
dragnpkgs-custom = { system, ... } @ args: let
unsafeConf = if builtins.hasAttr "extraBuiltins" builtins then (
let conf = builtins.extraBuiltins; in
if builtins.isAttrs conf then conf else {}
) else {};
possiblyCommitCrimes =
if
builtins.hasAttr "dragnpkgs" unsafeConf &&
builtins.isAttrs unsafeConf.dragnpkgs &&
builtins.hasAttr "possiblyCommitCrimes" unsafeConf.dragnpkgs &&
builtins.isBool unsafeConf.dragnpkgs.possiblyCommitCrimes
then
unsafeConf.dragnpkgs.possiblyCommitCrimes
else
false;
in
final.nixpkgs-custom (args // {
overlays = overlays ++ (args.overlays or []);
config = (args.config or {}) // {
allowlistedLicenses = (final.optionals
possiblyCommitCrimes
[ final.licenses.fyptl ]) ++ (args.config.allowlistedLicenses or []);
};
});
nixos = import "${nixpkgs}/nixos/lib" { lib = final; };
nixosSystem = args:
import "${nixpkgs}/nixos/lib/eval-config.nix" (
{
@@ -86,161 +32,19 @@
modules = args.modules ++ [
({ config, pkgs, lib, ... }: {
config.nixpkgs.flake.source = self.outPath;
config.nixpkgs.overlays = overlays;
config.nixpkgs = {
# we remove nixpkgs' machinery for setting self flake references and
# replace it with our own (in the next inline module)
flake = {
source = self.outPath;
setNixPath = false;
setFlakeRegistry = false;
};
overlays = overlays;
};
# this is in the flake rather than in module.nix so there's still control over
# channels if you're not using a flake based config. but for flake based
# configs, we're not doing channels anymore
config.nix = {
channel.enable = false;
};
})
({ options, config, pkgs, lib, ...}: {
options.dragnpkgs = {
setFlakeRegistry = lib.mkOption {
description = "Set flake registry option pointing to self";
type = lib.types.bool;
default = true;
defaultText = lib.literalExpression "true";
example = lib.literalExpression "false";
};
setNixPath = lib.mkOption {
description = "Set nix path entry pointing to self";
type = lib.types.bool;
default = true;
defaultText = lib.literalExpression "true";
example = lib.literalExpression "false";
};
setNixpkgsFlakeAlias = lib.mkOption {
description = "Set flake registry entry for `nixpkgs` to self";
type = lib.types.bool;
default = true;
defaultText = lib.literalExpression "true";
example = lib.literalExpression "false";
};
setTemplatesFlakeAlias = lib.mkOption {
description = "Set flake registry entry for `templates` to self";
type = lib.types.bool;
default = true;
defaultText = lib.literalExpression "true";
example = lib.literalExpression "false";
};
possiblyCommitCrimes = lib.mkOption {
description = ''
Globally enable usage of packages marked as FYPTL. This installs a nix
plugin, which is widely considered to be a nix crime, and it also might
be an actual crime to use these packages depending on your jurisdiction. Use
at your own risk
'';
type = lib.types.bool;
default = false;
defaultText = lib.literalExpression "false";
example = lib.literalExpression "false";
};
};
config.nix.registry.dragnpkgs-unstable =
lib.mkIf config.dragnpkgs.setFlakeRegistry self.meta.registry-entry;
config.nix.registry.nixpkgs = lib.mkIf config.dragnpkgs.setNixpkgsFlakeAlias {
from = { id = "nixpkgs"; type = "indirect"; };
to = { id = "dragnpkgs-unstable"; type = "indirect"; };
};
config.nix.registry.templates = lib.mkIf config.dragnpkgs.setTemplatesFlakeAlias {
from = { id = "templates"; type = "indirect"; };
to = { id = "dragnpkgs-unstable"; type = "indirect"; };
};
config.nix.nixPath = lib.mkIf config.dragnpkgs.setNixPath [
self.meta.path-entry
];
config.nixpkgs.config = lib.mkIf config.dragnpkgs.possiblyCommitCrimes {
allowlistedLicenses = [ lib.licenses.fyptl ];
};
config.nix.settings.plugin-files =
lib.optionals config.dragnpkgs.possiblyCommitCrimes [
"${pkgs.lix-plugins}/lib/liblix-plugins.so"
];
config.nix.settings.extra-builtins-file =
lib.mkIf config.dragnpkgs.possiblyCommitCrimes (
lib.mkForce "/etc/nix/extra-builtins.nix"
);
config.environment.etc = lib.mkIf config.dragnpkgs.possiblyCommitCrimes {
"nix/extra-builtins.nix".text =
let
possiblyCommitCrimes =
lib.boolToString config.dragnpkgs.possiblyCommitCrimes;
in ''
{ ... }: {
dragnpkgs = {
possiblyCommitCrimes = ${possiblyCommitCrimes};
};
}
'';
};
})
(import ./module.nix)
];
} // builtins.removeAttrs args [ "modules" ]
);
mkFlake = flakeDef:
let
rewritePerSystem = sectionDef: (forAllSystems (system:
builtins.mapAttrs (name: value:
if final.isDerivation value then
value
else
self.legacyPackages.${system}.callPackage value {}
) sectionDef
));
in
builtins.mapAttrs (name: value:
if name == "packages" || name == "legacyPackages" || name == "devShells" ||
name == "apps" then
rewritePerSystem value
else
value
) flakeDef;
});
legacyPackages = forAllSystems (system:
self.lib.dragnpkgs-custom { inherit system; }
nixpkgs.legacyPackages.${system}.appendOverlays overlays
);
nixosModules = nixpkgs.nixosModules;
templates = {
default = {
path = ./templates/default;
description = "A very basic flake (with dragnpkgs)";
};
beamer = {
path = ./templates/beamer;
description = "A very basic presentation with Beamer";
};
};
defaultTemplate = self.templates.default;
};
}


@@ -0,0 +1,14 @@
{ lib }: { root, subdir ? null }:
let
fs = lib.fileset;
sourceFiles = fs.difference
(fs.gitTracked root)
(fs.fileFilter (file: file.hasExt "nix") root);
finalSourceFiles =
if subdir == null then
sourceFiles
else
fs.intersection sourceFiles subdir;
finalRoot = if subdir == null then root else subdir;
in
fs.toSource { root = finalRoot; fileset = finalSourceFiles; }


@@ -1,17 +0,0 @@
The Fuck You, Pirate This License (FYPTL)
---
Copyright (c) 2024 [Copyright Holder(s)]. All Rights Reserved.
Permission to use, copy, modify, and/or distribute this software IS NOT granted
for any purpose. Performing any such actions in connection with this software
may constitute copyright infringement, and the copyright holder(s) may pursue
any remedies for such infringement in accordance with applicable law.
THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH
REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND
FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT,
INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM
LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR
OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR
PERFORMANCE OF THIS SOFTWARE.


@@ -1,8 +0,0 @@
{
shortName = "FYPTL";
fullName = "Fuck You, Pirate This License";
deprecated = false;
free = false;
redistributable = false;
url = "https://git.lain.faith/haskal/dragnpkgs/src/branch/main/lib/licenses/FYPTL";
}


@@ -25,7 +25,7 @@ let
base-container = runCommand "empty.sif.d" {
buildInputs = [ coreutils ];
} ''
mkdir -p "$out"
mkdir "$out"
cd "$out"
mkdir -p proc sys dev nix etc bin usr/bin .singularity.d
ln -s /etc/sh bin/sh
@@ -44,7 +44,6 @@ let
mkdir -p /var/lib/singularity/mnt/session
echo "root:x:0:0:System administrator:/root:/bin/sh" > /etc/passwd
echo > /etc/resolv.conf
mkdir -p "$out"
${singularity}/bin/singularity build "$out/empty.sif" "container/"
'');
@@ -68,7 +67,7 @@ let
base-etc = runCommand "singularity-etc" {
buildInputs = [ coreutils bash cacert ];
} ''
mkdir -p "$out"
mkdir "$out"
ln -s "${shell}/bin/startup.sh" "$out/runscript"
ln -s "${bash}/bin/bash" "$out/sh"
ln -s "${coreutils}/bin/env" "$out/env"
@@ -82,15 +81,13 @@
'';
squashfs = makeSquashFs { filename = "nix-store"; storeContents = [ shell ]; comp = "gzip"; };
squashfs = makeSquashFs { filename = "nix-store"; storeContents = [ shell ]; };
startCommand = writeText "run-container.sh" ''
#!/usr/bin/env bash
set -euo pipefail
module load singularity/3.10.3
if ! which singularity &>/dev/null; then
module load singularity/3.10.3
fi
temp_dir="$(mktemp -d)" temp_dir="$(mktemp -d)"
mkdir -p "''${TMPDIR:-/tmp}/empty" mkdir -p "''${TMPDIR:-/tmp}/empty"
@ -110,15 +107,10 @@ let
cat /etc/localtime > $temp_dir/etc/localtime cat /etc/localtime > $temp_dir/etc/localtime
cat /etc/resolv.conf > $temp_dir/etc/resolv.conf cat /etc/resolv.conf > $temp_dir/etc/resolv.conf
workdir="/work" singularity run -B "/work:/work,/scratch:/scratch,$temp_dir/nix-store.squashfs:/nix/store:image-src=/,$temp_dir/etc:/etc" --pid --uts --ipc container-base.sif
if [ ! -d "/work" ]; then
workdir="/projects"
fi
singularity run -B "/$workdir:/$workdir,/scratch:/scratch,$temp_dir/nix-store.squashfs:/nix/store:image-src=/,$temp_dir/etc:/etc" --pid --uts --ipc container-base.sif
'';
in runCommand "hpc-files.d" {} ''
mkdir -p "$out"
mkdir "$out"
cp "${squashfs}" "$out/nix-store.squashfs"
cp -r "${base-etc}" "$out/etc"
cp "${container-image}/empty.sif" "$out/container-base.sif"


@@ -1,3 +0,0 @@
final: prev: {
licenses = prev.licenses // { fyptl = import ./licenses/fyptl.nix; };
}


@@ -1,23 +1,5 @@
{ ... }: {
imports = [
./modules/ghidra-client
./modules/ghidra-server
./modules/idapro
./modules/machine-info
./modules/regdom
./modules/satisfactory-dedicated-server
];
# set some nix settings defaults
config.nix.settings = {
repl-overlays = [ ./repl-overlay.nix ];
experimental-features = "nix-command flakes pipe-operator";
temp-dir = "/var/tmp";
# we're disabling the default flake registry because i don't like it
flake-registry = "";
# sigh
use-xdg-base-directories = "true";
};
}


@@ -1,73 +0,0 @@
{
config,
lib,
pkgs,
...
}:
let
cfg = config.programs.ghidra;
isSplit = lib.elem "lib" cfg.package.outputs;
libOutput = if isSplit then cfg.package.lib else cfg.package;
packageWithExts = cfg.package.withExtensions
(p: lib.concatMap (pl: pl p) cfg.extensions);
in
{
disabledModules = [ "programs/ghidra.nix" ];
options.programs.ghidra = {
enable = lib.mkEnableOption "Ghidra, a software reverse engineering (SRE) suite of tools";
gdb = lib.mkOption {
default = true;
type = lib.types.bool;
description = ''
Whether to add to gdbinit the python modules required to make Ghidra's debugger work.
'';
};
package = lib.mkPackageOption pkgs "ghidra" { example = "ghidra_headless"; };
extensions = lib.mkOption {
type = with lib.types; listOf (functionTo (listOf package));
default = [];
description = ''
Ghidra extensions to be included in the installation.
'';
example = lib.literalExpression "[ (ps: with ps; [ my_extension ]) ]";
};
binsync = {
enable = lib.mkEnableOption "Ghidra binsync integration";
};
};
config = lib.mkIf cfg.enable {
programs.ghidra.extensions = lib.mkIf (cfg.binsync.enable) [
(ps: [ ps.binsync ])
];
environment = {
systemPackages = [
packageWithExts
];
etc = lib.mkIf cfg.gdb {
"gdb/gdbinit.d/ghidra-modules.gdb".text = with pkgs.python3.pkgs; ''
python
import sys
[sys.path.append(p) for p in "${
(makePythonPath [
psutil
protobuf
])
}".split(":")]
sys.path.append("${libOutput}/lib/ghidra/Ghidra/Debug/Debugger-agent-gdb/pypkg/src")
sys.path.append("${libOutput}/lib/ghidra/Ghidra/Debug/Debugger-rmi-trace/pypkg/src")
end
'';
};
};
};
}


@@ -8,9 +8,9 @@ let
in {
options.services.ghidra-server = {
enable = mkEnableOption "ghidra-server";
enableAdminCli = mkEnableOption "ghidra-svrAdmin" // { default = true; };
enableAdminCli = mkEnableOption "ghidra-svrAdmin";
package = mkPackageOption pkgs "ghidra_headless" { };
jdkPackage = mkPackageOption pkgs "openjdk21_headless" { };
jdkPackage = mkPackageOption pkgs "openjdk17_headless" { };
host = mkOption {
default = null;
defaultText = literalExpression "null";


@@ -1,44 +0,0 @@
{
config,
lib,
pkgs,
...
}:
let
cfg = config.programs.idapro;
binsyncPkg = pkgs.python311.pkgs.binsync;
binsyncPath = "${pkgs.python311.pkgs.binsync}/${pkgs.python311.sitePackages}";
idaproConfigured = cfg.package.override {
pythonDeps = lib.optionals cfg.binsync.enable [binsyncPkg];
plugins = lib.optionals cfg.binsync.enable [
(pkgs.runCommand "binsync-ida" {} ''
mkdir -p $out/plugins
cp ${binsyncPath}/binsync/binsync_plugin.py $out/plugins
'')
];
};
in
{
options.programs.idapro = {
enable = lib.mkEnableOption "IDA Pro";
package = lib.mkPackageOption pkgs "idapro" {
example = lib.literalExpression "idapro.override { ... }";
};
binsync = {
enable = lib.mkEnableOption "IDA binsync integration";
};
};
config = lib.mkIf cfg.enable {
environment = {
systemPackages = [
idaproConfigured
];
sessionVariables.IDAUSR = "$HOME/.config/idapro";
};
};
}


@@ -1,119 +0,0 @@
{ config, pkgs, lib, ... }: with lib; {
options.environment.machineInfo = mkOption {
description = lib.mdDoc ''
Machine metadata, including stylized hostname, computer icon, etc.
This module controls the options written to `/etc/machine-info`. For more
information, see [the freedesktop documentation][1].
[1]: https://www.freedesktop.org/software/systemd/man/machine-info.html
'';
default = {};
type = types.submodule { options = {
prettyHostname = mkOption {
description = lib.mdDoc ''
A pretty, human-readable hostname for this machine, potentially including
spaces, unicode, and emoji. If unset, this falls back to the network hostname
set in `networking.hostName`.
'';
type = with types; nullOr str;
default = null;
defaultText = literalExpression "null";
example = literalExpression "\"Jade's Laptop 💎\"";
};
iconName = mkOption {
description = lib.mdDoc ''
An XDG icon which should be associated with this machine. Some common choices
include: `"computer"`, `"phone"`, but a complete list of icons can be found in
the [XDG Icon Naming Spec][1].
If left unset, applications will typically default to `"computer"`.
[1]: https://specifications.freedesktop.org/icon-naming-spec/icon-naming-spec-latest.html
'';
type = with types; nullOr str;
default = null;
defaultText = literalExpression "null";
example = literalExpression "\"computer\"";
};
chassis = mkOption {
description = lib.mdDoc ''
The type of chassis this machine resides within. This is typically detected
automatically, but can be manually overridden here.
'';
type = with types; nullOr (enum [
"desktop"
"laptop"
"convertible"
"server"
"tablet"
"handset"
"watch"
"embedded"
"vm"
"container"
]);
default = null;
defaultText = literalExpression "null";
example = literalExpression "\"server\"";
};
deployment = mkOption {
description = lib.mdDoc ''
If this machine is part of a deployment environment / pipeline, this option can
be used to specify what environment/pipeline stage it manages.
Typically, but not necessarily, set to something like `"development"`,
`"integration"`, `"staging"`, or `"production"`.
'';
type = with types; nullOr str;
default = null;
defaultText = literalExpression "null";
example = literalExpression "\"production\"";
};
location = mkOption {
description = lib.mdDoc ''
A human-readable short description of the location of this machine.
This can be set to whatever has the most meaning for you, for example "Living
Room", "Left Rack, 2nd Shelf", or "Parishville, NY".
'';
type = with types; nullOr str;
default = null;
defaultText = literalExpression "null";
example = literalExpression "\"Bedroom\"";
};
extraOptions = mkOption {
description = lib.mdDoc ''
Extra variables to put in `/etc/machine-info`
'';
type = with types; attrsOf str;
default = {};
defaultText = literalExpression "{ }";
example = literalExpression "{ HARDWARE_VENDOR = \"Intel Corp.\" }";
};
};};
};
config.environment.etc.machine-info =
with config.environment.machineInfo;
let
rawShellVars = {
PRETTY_HOSTNAME = prettyHostname;
ICON_NAME = iconName;
CHASSIS = chassis;
DEPLOYMENT = deployment;
LOCATION = location;
} // extraOptions;
nonNullShellVars = attrsets.filterAttrs (k: v: v != null) rawShellVars;
in rec {
text = strings.toShellVars nonNullShellVars;
enable = builtins.stringLength text > 0;
};
}


@@ -1,17 +0,0 @@
{ config, pkgs, lib, ... }:
with lib;
let
cfg = config.hardware.wirelessRegulatoryDomain;
in {
options.hardware.wirelessRegulatoryDomain = mkOption {
description = "The wireless regulatory domain to set in the kernel cfg80211 module";
type = with types; nullOr str;
default = null;
defaultText = literalExpression "null";
example = literalExpression "\"US\"";
};
config.boot.extraModprobeConfig = mkIf (cfg != null) ''
options cfg80211 ieee80211_regdom=${cfg}
'';
}


@@ -1,421 +0,0 @@
{ config, lib, pkgs, ... }:
let
cfg = config.services.satisfactory;
in {
options.services.satisfactory = with lib; {
enable = mkEnableOption "satisfactory";
package = mkPackageOption pkgs "satisfactory-dedicated-server" {};
directory = mkOption {
description = ''
Directory where Satisfactory Dedicated Server data will be stored
'';
default = "/var/lib/satisfactory";
type = types.str;
example = literalExpression "\"/data/games/satisfactory\"";
};
user = mkOption {
description = "User account under which Satisfactory Dedicated Server runs.";
default = "satisfactory";
type = types.str;
example = literalExpression "\"satisfactory2\"";
};
group = mkOption {
description = "Group under which Satisfactory Dedicated Server runs.";
default = "satisfactory";
type = types.str;
example = literalExpression "\"satisfactory2\"";
};
useACMEHost = mkOption {
description = ''
If set, the server will use the ACME-provided TLS certificate for the given host.
Note that this module does not actually provision the specified certificate; you must
use additional config (e.g., `services.nginx.virtualHosts.<name>.enableACME = true`) to
provision the certificate using a supported ACME method.
'';
default = null;
type = types.nullOr types.str;
example = literalExpression "\"myserver.example\"";
};
port = mkOption {
description = ''
Server port number (TCP/UDP)
This corresponds to the `-Port` command line option.
'';
default = 7777;
type = types.port;
example = literalExpression "7778";
};
reliablePort = mkOption {
description = ''
Server reliable port number
This corresponds to the `-ReliablePort` command line option.
'';
default = 8888;
type = types.port;
example = literalExpression "8889";
};
externalReliablePort = mkOption {
description = ''
Server reliable port number as seen outside NAT.
This corresponds to the `-ExternalReliablePort` command line option.
'';
default = null;
type = types.nullOr types.port;
example = literalExpression "12345";
};
disableSeasonalEvents = mkOption {
description = ''
Whether to run the server with seasonal events disabled.
This corresponds to the `-DisableSeasonalEvents` command line option.
'';
default = false;
type = types.bool;
example = literalExpression "true";
};
extraIniOptions = mkOption {
description = ''
Run the server with additional ini configuration values.
This is a nested attribute set of values.
- The top level attribute specifies the ini file containing the value to set (i.e., the
first component of the `-ini` command line option), for example `Game` or `Engine`.
- The secondary level attribute specifies the ini file category, without brackets,
for example `/Script/Engine.GameSession`.
- The final level attribute specifies the option name to set, for example
`MaxPlayers`. The value of the attribute is the value to set on the command line.
This corresponds to the `-ini` command line option.
'';
default = {};
type = with types; attrsOf (attrsOf (attrsOf str));
example = literalExpression ''
{
Game."/Script/Engine.GameSession".MaxPlayers = "8";
}
'';
};
initialSettings = mkOption {
description = ''
Settings to apply to the server via the server API on the first run.
'';
type = types.submodule {
options = {
serverName = mkOption {
description = ''
The name of the server.
If this is provided, `adminPasswordFile` must also be set.
'';
type = with types; nullOr str;
default = null;
example = literalExpression "\"My Dedicated Server\"";
};
adminPasswordFile = mkOption {
description = ''
Path to a file containing the initial admin password.
If this is provided, `serverName` must also be set.
'';
type = with types; nullOr path;
default = null;
example = literalExpression "\"/var/lib/secrets/admin-password.txt\"";
};
clientPasswordFile = mkOption {
description = ''
Path to a file containing the initial client password. If not set, the server will
not be configured with a client password and will be accessible to any client.
'';
type = with types; nullOr path;
default = null;
example = literalExpression "\"/var/lib/secrets/client-password.txt\"";
};
};
};
};
};
config = lib.mkIf cfg.enable {
assertions = [
{
assertion = with cfg.initialSettings; (serverName == null) == (adminPasswordFile == null);
message = ''
When either of services.satisfactory.initialSettings.serverName or
services.satisfactory.initialSettings.adminPasswordFile are set, the other must also be
set. The dedicated server API requires configuring both options simultaneously.
'';
}
{
assertion = with cfg.initialSettings; (clientPasswordFile == null) || (serverName != null);
message = ''
Option services.satisfactory.initialSettings.clientPasswordFile is set, but there are
no options set for the initial server claim data (i.e., serverName and adminPasswordFile).
Setting a client password is not possible without executing a server claim.
'';
}
];
users.users."${cfg.user}" = {
isSystemUser = true;
home = cfg.directory;
group = cfg.group;
createHome = false;
};
users.groups."${cfg.group}" = {};
systemd.tmpfiles.settings."satisfactory" = let
default = {
inherit (cfg) user group;
mode = "0755";
};
in {
"${cfg.directory}".d = default;
"${cfg.directory}/saves".d = default;
"${cfg.directory}/settings".d = default;
"${cfg.directory}/settings/game".d = default;
"${cfg.directory}/settings/engine".d = default;
};
systemd.services = let
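# the dedicated server API is served over HTTPS on the game port; curl below uses -k
# because the server's certificate is self-signed by default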
base_url = "https://127.0.0.1:${builtins.toString cfg.port}/api/v1/";
binary = "${cfg.directory}/server/Engine/Binaries/Linux/FactoryServer-Linux-Shipping";
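# flatten extraIniOptions into "-ini:<File>:[<Section>]:<Key>=<Value>" arguments, e.g.
# Game."/Script/Engine.GameSession".MaxPlayers = "8"
# becomes -ini:Game:[/Script/Engine.GameSession]:MaxPlayers=8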
ini_list = lib.flatten (
lib.mapAttrsToList (filename: fileOpts:
lib.mapAttrsToList (section: sectionOpts:
lib.mapAttrsToList (key: value:
" -ini:${filename}:[${section}]:${key}=${value}"
) sectionOpts
) fileOpts
) cfg.extraIniOptions
);
ini_args = lib.concatStringsSep " " ini_list;
port = with builtins;
"-Port=${toString cfg.port} -ReliablePort=${toString cfg.reliablePort}";
extport =
if cfg.externalReliablePort == null then
""
else
" -ExternalReliablePort=${builtins.toString cfg.externalReliablePort}";
seasonalEvts =
if cfg.disableSeasonalEvents then
" -DisableSeasonalEvents"
else
"";
args = "${port}${extport}${seasonalEvts}${ini_args}";
server_command = "${binary} FactoryGame ${args}";
doSetup = cfg.initialSettings.serverName != null;
commonConfig = {
after = ["network.target"] ++ lib.optionals (cfg.useACMEHost != null) [
"acme-finished-${cfg.useACMEHost}.target"
];
unitConfig = {
RequiresMountsFor = cfg.directory;
};
serviceConfig = {
Nice = "-5";
User = cfg.user;
Group = cfg.group;
WorkingDirectory = cfg.directory;
StandardOutput = "journal";
LoadCredential = lib.mkIf (cfg.useACMEHost != null) (let
certDir = config.security.acme.certs.${cfg.useACMEHost}.directory;
in [
"cert_chain.pem:${certDir}/fullchain.pem"
"private_key.pem:${certDir}/key.pem"
]);
ProtectSystem = true;
ProtectHome = true;
NoNewPrivileges = true;
# virtualize the file system: mount a tmpfs over the state directory, bind the packaged
# server into it read-only, and bind writable paths for the dedicated server game state
PrivateTmp = true;
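# CacheDirectory provides /var/cache/satisfactory, which is bind-mounted below over
# FactoryGame/Intermediate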
CacheDirectory = "satisfactory";
TemporaryFileSystem = [
"${cfg.directory}:ro"
];
BindReadOnlyPaths = [
"${cfg.package}/opt:${cfg.directory}/server"
];
BindPaths = [
"${cfg.directory}/saves:${cfg.directory}/.config/Epic"
"/var/cache/satisfactory:${cfg.directory}/server/FactoryGame/Intermediate"
"${cfg.directory}/settings/game:${cfg.directory}/server/FactoryGame/Saved"
"${cfg.directory}/settings/engine:${cfg.directory}/server/Engine/Saved"
] ++ lib.optionals (cfg.useACMEHost != null) [
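# %d expands to the service credentials directory populated by LoadCredential above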
"%d:${cfg.directory}/server/FactoryGame/Certificates"
];
Restart = "on-failure";
RestartSec = 60;
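# 143 corresponds to termination by SIGTERM (128 + 15); treat it as a clean shutdown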
SuccessExitStatus = 143;
};
};
in {
"satisfactory" = lib.mkMerge [
commonConfig
{
description = "Satisfactory Dedicated Server";
wantedBy = [ "multi-user.target" ];
requires = lib.optionals doSetup ["satisfactory-first-time-setup.service"];
after = lib.optionals doSetup ["satisfactory-first-time-setup.service"];
serviceConfig = {
ExecStart = server_command;
};
}
];
"satisfactory-first-time-setup" = lib.mkIf doSetup (lib.mkMerge [
commonConfig
{
description = "Satisfactory Dedicated Server first-time setup";
path = with pkgs; [
curl
jq
];
unitConfig = {
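# skip the first-time setup once server settings exist on disk, i.e. the server has
# already been claimed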
ConditionPathExists =
"!${cfg.directory}/saves/FactoryGame/Saved/SaveGames/ServerSettings.${builtins.toString cfg.port}.sav";
};
serviceConfig = {
Type = "oneshot";
# isolate satisfactory during configuration
PrivateNetwork = true;
LoadCredential =
(lib.optionals (cfg.initialSettings.adminPasswordFile != null) [
"admin_password.txt:${cfg.initialSettings.adminPasswordFile}"
]) ++ (lib.optionals (cfg.initialSettings.clientPasswordFile != null) [
"client_password.txt:${cfg.initialSettings.clientPasswordFile}"
]);
};
script = ''
set -euo pipefail
set -m
echo Starting server...
${server_command} &
server_pid=$!
server_status=""
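# wait for the local API to report healthy (up to 5 attempts, 5 seconds apart)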
for i in {1..5}; do
server_status="$(curl -SsLk -XPOST -H "Content-Type: application/json" \
--data '{"function":"HealthCheck","data":{"clientCustomData":""}}' \
"${base_url}" | jq -r '.data.health' || true)"
if [ "$server_status" == "healthy" ]; then
break
fi
sleep 5
done
if [ "$server_status" != "healthy"; then
echo Server did not report healthy status in time
exit 1
fi
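# request an InitialAdmin token via passwordless login; this is only expected to succeed
# while the server is still unclaimed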
token="$(curl -SsLk -XPOST -H "Content-Type: application/json" \
--data '{"function":"PasswordlessLogin","data":{"MinimumPrivilegeLevel":"InitialAdmin"}}' \
"${base_url}" | jq -r '.data.authenticationToken')"
if [ "$token" == "null" ]; then
echo Server authentication failed
exit 2
fi
echo Executing server claim...
data="$(jq -n \
--arg "serverName" "${cfg.initialSettings.serverName}" \
--rawfile "password" "$CREDENTIALS_DIRECTORY/admin_password.txt" \
'{} |.function="ClaimServer" | .data.ServerName=$serverName | .data.AdminPassword=($password|rtrimstr("\n"))')"
new_token="$(curl -SsLk -XPOST -H "Content-Type: application/json" \
-H "Authorization: Bearer $token" \
--data "$data" \
"${base_url}" | jq -r '.data.authenticationToken')"
if [ "$new_token" == "null" ]; then
echo Server claim failed
exit 2
fi
token="$new_token"
if [ -f "$CREDENTIALS_DIRECTORY/client_password.txt" ]; then
echo Setting client password...
data="$(jq -n \
--rawfile "password" "$CREDENTIALS_DIRECTORY/client_password.txt" \
'{} |.function="SetClientPassword" | .data.Password=($password|rtrimstr("\n"))')"
result="$(curl -SsLk -XPOST -H "Content-Type: application/json" \
-H "Authorization: Bearer $token" \
--data "$data" \
"${base_url}" | jq -r '.data')"
if [ "$result" != "" ]; then
echo "Password set failed: $result"
exit 4
fi
fi
echo Setup complete
echo Stopping server...
kill -SIGTERM $server_pid
wait $server_pid
'';
}
]);
"satisfactory-restart-certs" = lib.mkIf (cfg.useACMEHost != null) {
description = "Restart Satisfactory Dedicated Server after cert provisioning";
wantedBy = ["acme-finished-${cfg.useACMEHost}.target"];
path = [config.systemd.package];
script = ''
systemctl try-restart satisfactory.service
'';
serviceConfig = {
Type = "simple";
};
};
};
};
}

View File

@ -1,99 +1,42 @@
final: prev: { final: prev: {
lib = prev.lib.extend (import ./lib/overlay.nix);
fetchFromSteam = prev.callPackage ./lib/fetchsteam {}; fetchFromSteam = prev.callPackage ./lib/fetchsteam {};
fetchb4 = prev.callPackage ./lib/fetchb4 {}; fetchb4 = prev.callPackage ./lib/fetchb4 {};
gitSource = prev.callPackage ./lib/git-source {};
makeSquashFs = prev.callPackage ./lib/make-squashfs {}; makeSquashFs = prev.callPackage ./lib/make-squashfs {};
makeHpcDist = final.callPackage ./lib/make-hpc-dist {}; makeHpcDist = final.callPackage ./lib/make-hpc-dist {};
instrumentedFetch = drv: drv.overrideAttrs (afinal: aprev: { ghidra_headless = prev.ghidra.override {
postFetch = (aprev.postFetch or "") + '' openjdk17 = prev.openjdk17_headless;
printf "FETCH_HASH:%s:FETCH_HASH" "$(\ };
${final.lib.getExe final.nix} --extra-experimental-features "nix-command" \
hash path --sri "$out")"
'';
});
ghidra_headless = final.ghidra.lib;
# stuff that tracks upstream # stuff that tracks upstream
ghidra = final.callPackage ./pkgs/reverse-engineering/ghidra/build.nix { ghidra = final.callPackage ./pkgs/ghidra-xenia/build.nix {
protobuf = final.protobuf_21; protobuf = final.protobuf_21;
}; };
ghidra-extensions = final.lib.recurseIntoAttrs (final.callPackage ./pkgs/reverse-engineering/ghidra/extensions.nix { }); ghidra-extensions = final.lib.recurseIntoAttrs (final.callPackage ./pkgs/ghidra-xenia/extensions.nix { });
ghidra-bin = final.callPackage ./pkgs/ghidra-xenia { };
kicad = final.callPackage ./pkgs/kicad-xenia { };
kicadAddons = final.lib.recurseIntoAttrs (final.callPackage ./pkgs/kicad-xenia/addons {});
# end stuff that tracks upstream # end stuff that tracks upstream
idapro = final.callPackage ./pkgs/reverse-engineering/idapro9 {};
ocamlPackages = prev.ocamlPackages.overrideScope (ofinal: oprev: { ocamlPackages = prev.ocamlPackages.overrideScope (ofinal: oprev: {
ppx_unicode = ofinal.callPackage ./pkgs/ocaml/ppx_unicode {}; ppx_unicode = ofinal.callPackage ./pkgs/ocaml/ppx_unicode {};
xlog = ofinal.callPackage ./pkgs/ocaml/xlog {}; xlog = ofinal.callPackage ./pkgs/ocaml/xlog {};
systemd-ml = ofinal.callPackage ./pkgs/ocaml/systemd-ml {};
ocaml-manual = ofinal.callPackage ./pkgs/ocaml/ocaml-manual {};
patdiff-bin = ofinal.callPackage ./pkgs/ocaml/patdiff-bin {};
}); });
python312 = prev.python312.override { python312Packages = prev.python312Packages.overrideScope (pfinal: pprev: {
packageOverrides = pfinal: pprev: { feedvalidator = pfinal.callPackage ./pkgs/python/feedvalidator {};
feedvalidator = pfinal.callPackage ./pkgs/python/feedvalidator {}; });
megacom = pfinal.callPackage ./pkgs/python/megacom {};
};
};
# add to top level because it has a binary # add to top level because it has a binary
feedvalidator = final.python312Packages.feedvalidator; feedvalidator = final.python312Packages.feedvalidator;
megacom = final.python312Packages.megacom;
python311 = prev.python311.override {
packageOverrides = pfinal: pprev: {
libbs = pfinal.callPackage ./pkgs/reverse-engineering/binsync/libbs.nix {};
binsync = pfinal.callPackage ./pkgs/reverse-engineering/binsync/binsync.nix {};
};
};
outer-wilds-text-adventure = prev.callPackage ./pkgs/games/outer-wilds-text-adventure {}; outer-wilds-text-adventure = prev.callPackage ./pkgs/games/outer-wilds-text-adventure {};
satisfactory-dedicated-server = prev.callPackage ./pkgs/games/satisfactory-dedicated-server {};
mkNginxServer = prev.callPackage ./lib/dev-nginx {}; mkNginxServer = prev.callPackage ./lib/dev-nginx {};
zbasefind = final.callPackage ./pkgs/rust/zbasefind {};
eta = prev.callPackage ./pkgs/cmdline/eta {};
cado-nfs = prev.callPackage ./pkgs/crypto/cado-nfs {};
lix-plugins = prev.callPackage ./pkgs/lix/lix-plugins {};
nix-plugins = builtins.throw "nix-plugins is not supported. see pkgs.lix-plugins";
zfs_2_3 = prev.zfs_2_3.overrideAttrs {
patches = [ ./pkgs/zfs/0001-ZED-add-support-for-desktop-notifications-D-Bus.patch ];
};
pympress = prev.pympress.overrideDerivation (oldAttrs: {
patches = [ ./pkgs/python/pympress/0001-Fix-KDE-window-icon.patch ];
});
texliveDragonPackages = {
moloch = prev.callPackage ./pkgs/tex/moloch {};
};
racket-minimal = final.callPackage ./pkgs/racket/racket/minimal.nix {};
racket = final.callPackage ./pkgs/racket/racket/package.nix {};
racketPackages = let
names = builtins.readDir ./pkgs/racket/racket-catalog |> final.lib.attrNames;
byName = self:
final.lib.map (name: {
inherit name;
value = self.callPackage ./pkgs/racket/racket-catalog/${name} {};
}) names |>
final.lib.listToAttrs;
in final.lib.makeScope final.newScope (self: {
racketInstallHook = self.callPackage ./pkgs/racket/racket-install-hook.nix {};
buildRacketPackage = self.callPackage ./pkgs/racket/build-racket-package.nix {};
makeRacketEnv = self.callPackage ./pkgs/racket/make-racket-env.nix {};
} // (byName self));
} }

View File

@ -1,109 +0,0 @@
{
lib,
concatText,
fetchzip,
stdenvNoCC,
writeText,
writeShellApplication,
resholve,
bash,
cacert,
coreutils,
pacman,
systemd,
zstd,
repos ? ["core" "community" "extra"],
keyring-version ? "20250123-1",
keyring-hash ? "sha256-JW3z8MHVecayQ3heLbhPB+rMCuZ3QsjAYiFnVNfUeH0=",
mirror ? "https://mirror.rackspace.com/archlinux/$repo/os/$arch",
}: rec {
keyring = (fetchzip.override { withUnzip = false; }) {
url = "${builtins.replaceStrings ["$repo" "$arch"] ["core" "x86_64"] mirror}/archlinux-keyring-${keyring-version}-any.pkg.tar.zst";
hash = keyring-hash;
nativeBuildInputs = [ zstd ];
stripRoot = false;
postFetch = ''
rm "$out"/.BUILDINFO "$out"/.INSTALL "$out"/.MTREE "$out"/.PKGINFO
mkdir "$out"/share/pacman -p
mv "$out"/usr/share/pacman/keyrings "$out"/share/pacman
rm -rf "$out"/usr
'';
};
pacman_conf_in =
writeText
"pacman-mirrors.conf"
(lib.strings.concatLines
(lib.map
(repo: ''
[${repo}]
Server = ${mirror}
'')
repos));
pacman_conf = concatText "pacman.conf" [ "${pacman}/etc/pacman.conf" pacman_conf_in ];
bootstrap = resholve.writeScriptBin "archlinux-bootstrap" {
interpreter = "${bash}/bin/bash";
inputs = [ coreutils pacman systemd ];
execer = [
"cannot:${pacman}/bin/pacman-key"
"cannot:${systemd}/bin/systemd-nspawn"
];
} ''
set -o errexit
set -o nounset
set -o pipefail
if [ $# -lt 1 ]; then
echo "usage: $0 [directory] [pkgs ...]"
exit 1
fi
newroot="$1"
shift
echo "Installing arch linux to $newroot"
# set up new base filesystem
install -dm0755 "$newroot"
install -dm0755 "$newroot"/var/{cache/pacman/pkg,lib/pacman,log}
install -dm0755 "$newroot"/{dev,run,etc/pacman.d}
install -dm1777 "$newroot"/tmp
install -dm0555 "$newroot"/{sys,proc}
# set up mountpoint for nix
install -dm0755 "$newroot"/nix
# temporarily set up /etc/mtab, pacman needs this to work
ln -sf /proc/mounts "$newroot"/etc/mtab
# fully initialize the keyring ahead of entering the container
pacman_conf="${pacman_conf}"
pacman-key --gpgdir "$newroot"/etc/pacman.d/gnupg --config "$pacman_conf" --init
pacman-key --gpgdir "$newroot"/etc/pacman.d/gnupg --config "$pacman_conf" \
--populate archlinux --populate-from "${keyring}/share/pacman/keyrings"
# install the config file
install -Dm0755 "$pacman_conf" "$newroot"/etc/pacman.conf
# bootstrap the system. allow pacman to overwrite the existing mtab entry
systemd-nspawn -D "$newroot" --bind-ro=/nix \
-E SSL_CERT_FILE=${cacert}/etc/ssl/certs/ca-bundle.crt \
-E PATH=/usr/bin/ \
-- \
"${pacman}/bin/pacman" -Sy --noconfirm --overwrite /etc/mtab base "$@"
# remove nix mount point
rmdir "$newroot"/nix
echo "Done installing!"
echo "Set root password:"
echo " sudo systemd-nspawn -UD \"$newroot\" -- /bin/passwd root"
echo "Boot system:"
echo " sudo systemd-nspawn -bUD \"$newroot\""
'';
}

View File

@ -1,28 +0,0 @@
{
fetchFromGitHub,
stdenv,
lib,
}:
stdenv.mkDerivation {
pname = "eta";
version = "git";
src = fetchFromGitHub {
owner = "aioobe";
repo = "eta";
rev = "938f16bd088ce3d2a6f1bafbcdfd9a60d4d671ea";
hash = "sha256-rTXy1K4oDM1/NC6qpunDlyrEFyk93hkowrriuXODCMg=";
};
PREFIX = "";
installPhase = ''
make DESTDIR="$out" install
'';
meta = {
description = "Generic tool for monitoring ETA and progress of an arbitrary process.";
homepage = "https://github.com/aioobe/eta";
license = lib.licenses.gpl3Only;
maintainers = [];
mainProgram = "eta";
platforms = lib.platforms.all;
};
}

View File

@ -1,72 +0,0 @@
From b5e7381235ed64b58b267af8f796c50b01900464 Mon Sep 17 00:00:00 2001
From: xenia <xenia@awoo.systems>
Date: Wed, 20 Nov 2024 22:16:47 -0500
Subject: [PATCH] use PATH lookup for non-cado programs
---
scripts/cadofactor/cadoprograms.py | 9 +++++++++
1 file changed, 9 insertions(+)
diff --git a/scripts/cadofactor/cadoprograms.py b/scripts/cadofactor/cadoprograms.py
index 6743480e1..946771f83 100755
--- a/scripts/cadofactor/cadoprograms.py
+++ b/scripts/cadofactor/cadoprograms.py
@@ -1,4 +1,5 @@
import os
+import shutil
import platform
import abc
import inspect
@@ -327,6 +328,8 @@ class Program(object, metaclass=InspectType):
# class attributes, which properties can't. Ergo dummy variables
binary = None
+ use_which = False
+
# This class variable definition should not be here. It gets overwritten
# when the InspectType meta-class creates the class object. The only purpose
# is to make pylint shut up about the class not having an init_signature
@@ -408,6 +411,8 @@ class Program(object, metaclass=InspectType):
self.execfile = execsubfile
elif os.path.isfile(execfile):
self.execfile = execfile
+ elif self.use_which and shutil.which(binary) is not None:
+ self.execfile = shutil.which(binary)
else:
self.execfile = os.sep.join([self.subdir, binary])
if not skip_check_binary_exists:
@@ -1251,6 +1256,7 @@ class SSH(Program):
binary = "ssh"
name = binary
path = "/usr/bin"
+ use_which = True
def __init__(self,
host: PositionalParameter(),
*args: PositionalParameter(),
@@ -1268,6 +1274,7 @@ class RSync(Program):
binary = "rsync"
name = binary
path = "/usr/bin"
+ use_which = True
def __init__(self,
sourcefile: PositionalParameter(),
remotefile: PositionalParameter(),
@@ -1278,6 +1285,7 @@ class Ls(Program):
binary = "ls"
name = binary
path = "/bin"
+ use_which = True
def __init__(self,
*args : PositionalParameter(),
long : Toggle('l')=None,
@@ -1288,6 +1296,7 @@ class Kill(Program):
binary = "kill"
name = binary
path = "/bin"
+ use_which = True
def __init__(self,
*args: PositionalParameter(),
signal: Parameter("s"),
--
2.44.2

View File

@ -1,104 +0,0 @@
{
fetchFromGitLab,
lib,
stdenv,
# library deps
ecm,
gmp,
hwloc,
python3,
sqlite,
# runtime deps
openssh,
rsync,
util-linux,
coreutils,
# build deps
cmake,
curl,
inetutils,
perl,
makeBinaryWrapper,
# options
useArch ? "znver4",
useTune ? "znver4",
}: stdenv.mkDerivation rec {
pname = "cado-nfs";
git-rev = "bb65fdf0aaee0cea5e2da368bb87651d35d02603";
version = builtins.substring 0 7 git-rev;
src = fetchFromGitLab {
domain = "gitlab.inria.fr";
owner = pname;
repo = pname;
rev = git-rev;
hash = "sha256-Ao8nX9rZ0ky7MK5qXGgMe4N160sPN/En6h/YdeI2/JU=";
};
patches = [
./0001-use-PATH-lookup-for-non-cado-programs.patch
];
buildInputs = [
gmp
ecm
python3
sqlite
hwloc
];
nativeBuildInputs = [
cmake
inetutils
curl
perl
makeBinaryWrapper
];
NIX_CFLAGS_COMPILE = "-Wno-stringop-overflow"
+ (lib.optionalString (useArch != null) " -march=${useArch}")
+ (lib.optionalString (useTune != null) " -mtune=${useTune}");
postPatch = ''
patchShebangs --build .
'';
postConfigure = ''
patchShebangs --build .
'';
cadoBinPath = lib.makeBinPath [
openssh
rsync
util-linux
coreutils
];
postInstall = ''
wrapProgram $out/bin/cado-nfs-client.py \
--prefix PATH : ${cadoBinPath}
wrapProgram $out/bin/cado-nfs.py \
--prefix PATH : ${cadoBinPath}
'';
meta = {
description = "Cado-NFS, An Implementation of the Number Field Sieve Algorithm";
longDescription = ''
CADO-NFS is a complete implementation in C/C++ of the Number Field Sieve (NFS) algorithm for
factoring integers and computing discrete logarithms in finite fields. It consists of various
programs corresponding to all the phases of the algorithm, and a general script that runs
them, possibly in parallel over a network of computers.
'';
homepage = "https://cado-nfs.gitlabpages.inria.fr/";
license = lib.licenses.lgpl21Plus;
maintainers = [];
mainProgram = "cado-nfs.py";
platforms = lib.platforms.all;
};
}

View File

@ -7,46 +7,45 @@
}: }:
let let
appId = "1690800"; appId = "1690800";
buildId = "19876517"; buildId = "15636842";
steamworks_sdk = fetchFromSteam { steamworks_sdk = fetchFromSteam {
name = "steamworks-sdk"; name = "steamworks-sdk";
inherit appId; inherit appId;
depot = { depot = {
depotId = "1006"; depotId = "1006";
manifestId = "5587033981095108078"; manifestId = "7138471031118904166";
}; };
hash = "sha256-CjrVpq5ztL6wTWIa63a/4xHM35DzgDR/O6qVf1YV5xw="; hash = "sha256-OtPI1kAx6+9G09IEr2kYchyvxlPl3rzx/ai/xEVG4oM=";
}; };
server_dist = fetchFromSteam { server_dist = fetchFromSteam {
name = "satisfactory-dedicated-server"; name = "satisfactory-dedicated-server";
inherit appId; inherit appId;
depot = { depot = {
depotId = "1690802"; depotId = "1690802";
manifestId = "7620210706575413121"; manifestId = "1910179703516567959";
}; };
hash = "sha256-jQbtHSBFCDcdycrDjIJBY4DGV7EgITvwv3k3+htZ7io="; hash = "sha256-TxPegZFAwiAzuHgw9xLGr5sAP7KAVMMfPFYL7TRX1O0=";
}; };
in stdenv.mkDerivation { in stdenv.mkDerivation {
pname = "satisfactory-dedicated-server"; pname = "satisfactory-dedicated-server";
version = "build-${buildId}"; version = "build-${buildId}";
src = server_dist; src = server_dist;
buildInputs = [ steamworks_sdk ];
propagatedBuildInputs = [ SDL2 ];
dontConfigure = true; dontConfigure = true;
dontBuild = true; dontBuild = true;
installPhase = '' installPhase = ''
mkdir -p $out/opt mkdir -p $out
cp -r . $out/opt/. cp -r . $out/.
cp -r ${steamworks_sdk}/linux64 $out/opt cp -r ${steamworks_sdk}/linux64 $out
mkdir -p $out/opt/FactoryGame/Intermediate mkdir -p $out/FactoryGame/Intermediate
mkdir -p $out/opt/FactoryGame/Saved mkdir -p $out/FactoryGame/Saved
mkdir -p $out/opt/FactoryGame/Certificates
mkdir -p $out/opt/Engine/Saved rm $out/FactoryServer.sh
rm $out/opt/FactoryServer.sh
''; '';
dontStrip = true; dontStrip = true;
@ -57,23 +56,19 @@ in stdenv.mkDerivation {
preFixup = '' preFixup = ''
echo patching binaries echo patching binaries
chmod +x $out/opt/Engine/Binaries/Linux/FactoryServer-Linux-Shipping chmod +x $out/Engine/Binaries/Linux/FactoryServer-Linux-Shipping
patchelf \ patchelf --add-needed ${SDL2}/lib/libSDL2-2.0.so.0 \
--add-needed ${SDL2}/lib/libSDL2-2.0.so.0 \ $out/linux64/steamclient.so
$out/opt/linux64/steamclient.so
patchelf \ patchelf --set-interpreter "$(cat $NIX_CC/nix-support/dynamic-linker)" \
--set-interpreter "$(cat $NIX_CC/nix-support/dynamic-linker)" \ --add-needed $out/linux64/steamclient.so \
--add-needed $out/opt/linux64/steamclient.so \ $out/Engine/Binaries/Linux/FactoryServer-Linux-Shipping
$out/opt/Engine/Binaries/Linux/FactoryServer-Linux-Shipping
''; '';
meta = with lib; { meta = with lib; {
description = "Satisfactory Dedicated Server"; description = "Satisfactory Dedicated Server";
homepage = "https://www.satisfactorygame.com/";
license = licenses.unfree; license = licenses.unfree;
platforms = [ "x86_64-linux" ]; platforms = [ "x86_64-linux" ];
sourceProvenance = [ sourceTypes.binaryNativeCode ];
}; };
} }

View File

@ -0,0 +1 @@
/nix/store/j0r1vyd1hd43rjzaday70wny2lhjkc1p-satisfactory-dedicated-server-build-15636842

View File

@ -1,13 +1,48 @@
diff --git a/Ghidra/Debug/Debugger-isf/build.gradle b/Ghidra/Debug/Debugger-isf/build.gradle From ffb6777d58f068db7e14372415154cd93f77766e Mon Sep 17 00:00:00 2001
index 2db94ed67e..925f394cf0 100644 From: roblabla <unfiltered@roblab.la>
--- a/Ghidra/Debug/Debugger-isf/build.gradle Date: Wed, 31 Jan 2024 13:19:55 +0100
+++ b/Ghidra/Debug/Debugger-isf/build.gradle Subject: [PATCH] Use com.google.protobuf:protobuf-gradle-plugin
@@ -18,11 +18,17 @@ apply from: "${rootProject.projectDir}/gradle/javaProject.gradle"
---
Ghidra/Debug/Debugger-gadp/build.gradle | 7 +-
Ghidra/Debug/Debugger-isf/build.gradle | 8 +-
Ghidra/Debug/Debugger-rmi-trace/build.gradle | 14 +--
build.gradle | 6 ++
gradle/debugger/hasProtobuf.gradle | 94 --------------------
5 files changed, 26 insertions(+), 103 deletions(-)
diff --git a/Ghidra/Debug/Debugger-gadp/build.gradle b/Ghidra/Debug/Debugger-gadp/build.gradle
index 9e1c57faf..3a3242eb5 100644
--- a/Ghidra/Debug/Debugger-gadp/build.gradle
+++ b/Ghidra/Debug/Debugger-gadp/build.gradle
@@ -18,11 +18,16 @@ apply from: "${rootProject.projectDir}/gradle/javaProject.gradle"
apply from: "${rootProject.projectDir}/gradle/jacocoProject.gradle" apply from: "${rootProject.projectDir}/gradle/jacocoProject.gradle"
apply from: "${rootProject.projectDir}/gradle/javaTestProject.gradle" apply from: "${rootProject.projectDir}/gradle/javaTestProject.gradle"
apply from: "${rootProject.projectDir}/gradle/distributableGhidraModule.gradle" apply from: "${rootProject.projectDir}/gradle/distributableGhidraModule.gradle"
-apply from: "${rootProject.projectDir}/gradle/hasProtobuf.gradle" -apply from: "${rootProject.projectDir}/gradle/debugger/hasProtobuf.gradle"
+apply plugin: 'com.google.protobuf'
apply plugin: 'eclipse'
eclipse.project.name = 'Debug Debugger-gadp'
+buildscript {
+ dependencies {
+ classpath 'com.google.protobuf:protobuf-gradle-plugin:0.8.18'
+ }
+}
dependencies {
api project(':Framework-AsyncComm')
api project(':Framework-Debugging')
diff --git a/Ghidra/Debug/Debugger-isf/build.gradle b/Ghidra/Debug/Debugger-isf/build.gradle
index d135294a0..785681ca2 100644
--- a/Ghidra/Debug/Debugger-isf/build.gradle
+++ b/Ghidra/Debug/Debugger-isf/build.gradle
@@ -18,11 +18,15 @@ apply from: "${rootProject.projectDir}/gradle/javaProject.gradle"
apply from: "${rootProject.projectDir}/gradle/jacocoProject.gradle"
apply from: "${rootProject.projectDir}/gradle/javaTestProject.gradle"
apply from: "${rootProject.projectDir}/gradle/distributableGhidraModule.gradle"
-apply from: "${rootProject.projectDir}/gradle/debugger/hasProtobuf.gradle"
-
+apply plugin: 'com.google.protobuf' +apply plugin: 'com.google.protobuf'
apply plugin: 'eclipse' apply plugin: 'eclipse'
eclipse.project.name = 'Debug Debugger-isf' eclipse.project.name = 'Debug Debugger-isf'
@ -17,22 +52,21 @@ index 2db94ed67e..925f394cf0 100644
+ classpath 'com.google.protobuf:protobuf-gradle-plugin:0.8.18' + classpath 'com.google.protobuf:protobuf-gradle-plugin:0.8.18'
+ } + }
+} +}
+
dependencies { dependencies {
api project(':ProposedUtils') api project(':Framework-AsyncComm')
} api project(':Framework-Debugging')
diff --git a/Ghidra/Debug/Debugger-rmi-trace/build.gradle b/Ghidra/Debug/Debugger-rmi-trace/build.gradle diff --git a/Ghidra/Debug/Debugger-rmi-trace/build.gradle b/Ghidra/Debug/Debugger-rmi-trace/build.gradle
index 4fa3b9a539..2663aeaeb0 100644 index 40fbc17ab..7517ffe6e 100644
--- a/Ghidra/Debug/Debugger-rmi-trace/build.gradle --- a/Ghidra/Debug/Debugger-rmi-trace/build.gradle
+++ b/Ghidra/Debug/Debugger-rmi-trace/build.gradle +++ b/Ghidra/Debug/Debugger-rmi-trace/build.gradle
@@ -19,12 +19,17 @@ apply from: "${rootProject.projectDir}/gradle/helpProject.gradle" @@ -18,12 +18,17 @@ apply from: "${rootProject.projectDir}/gradle/javaProject.gradle"
apply from: "${rootProject.projectDir}/gradle/jacocoProject.gradle" apply from: "${rootProject.projectDir}/gradle/jacocoProject.gradle"
apply from: "${rootProject.projectDir}/gradle/javaTestProject.gradle" apply from: "${rootProject.projectDir}/gradle/javaTestProject.gradle"
apply from: "${rootProject.projectDir}/gradle/distributableGhidraModule.gradle" apply from: "${rootProject.projectDir}/gradle/distributableGhidraModule.gradle"
-apply from: "${rootProject.projectDir}/gradle/hasProtobuf.gradle" -apply from: "${rootProject.projectDir}/gradle/debugger/hasProtobuf.gradle"
apply from: "${rootProject.projectDir}/gradle/hasPythonPackage.gradle"
-
+apply plugin: 'com.google.protobuf' +apply plugin: 'com.google.protobuf'
apply from: "${rootProject.projectDir}/gradle/debugger/hasPythonPackage.gradle"
apply plugin: 'eclipse' apply plugin: 'eclipse'
eclipse.project.name = 'Debug Debugger-rmi-trace' eclipse.project.name = 'Debug Debugger-rmi-trace'
@ -41,33 +75,30 @@ index 4fa3b9a539..2663aeaeb0 100644
+ classpath 'com.google.protobuf:protobuf-gradle-plugin:0.8.18' + classpath 'com.google.protobuf:protobuf-gradle-plugin:0.8.18'
+ } + }
+} +}
+
dependencies { dependencies {
api project(':ProposedUtils')
api project(':Pty') api project(':Pty')
@@ -37,13 +42,10 @@ dependencies { api project(':Debugger')
} @@ -44,12 +49,9 @@ task generateProtoPy {
ext.outdir = file("build/generated/source/proto/main/py")
task configureGenerateProtoPy { outputs.dir(outdir)
inputs.files(src)
- dependsOn(configurations.protocArtifact) - dependsOn(configurations.protocArtifact)
+ dependsOn(protobuf.generateProtoTasks.all()) + dependsOn(protobuf.generateProtoTasks.all())
doLast {
- doLast {
- def exe = configurations.protocArtifact.first() - def exe = configurations.protocArtifact.first()
- if (!isCurrentWindows()) { - if (!isCurrentWindows()) {
- exe.setExecutable(true) - exe.setExecutable(true)
- } - }
+ doLast {
+ def exe = protobuf.tools.protoc.path + def exe = protobuf.tools.protoc.path
generateProtoPy.commandLine exe exec {
generateProtoPy.args "--python_out=${generateProtoPy.outdir}" commandLine exe, "--python_out=$outdir", "-I$srcdir"
generateProtoPy.args "--pyi_out=${generateProtoPy.stubsOutdir}" args src
diff --git a/build.gradle b/build.gradle diff --git a/build.gradle b/build.gradle
index 159eb7dd7b..ef4add1ad8 100644 index b0c717fb1..5f56506a5 100644
--- a/build.gradle --- a/build.gradle
+++ b/build.gradle +++ b/build.gradle
@@ -80,6 +80,12 @@ if (flatRepo.isDirectory()) { @@ -74,6 +74,12 @@ if (flatRepo.isDirectory()) {
mavenCentral() jcenter()
flatDir name: "flat", dirs:["$flatRepo"] flatDir name: "flat", dirs:["$flatRepo"]
} }
+ buildscript { + buildscript {
@ -79,21 +110,20 @@ index 159eb7dd7b..ef4add1ad8 100644
} }
} }
else { else {
diff --git a/gradle/hasProtobuf.gradle b/gradle/hasProtobuf.gradle diff --git a/gradle/debugger/hasProtobuf.gradle b/gradle/debugger/hasProtobuf.gradle
deleted file mode 100644 index 23b4ce74b..e69de29bb 100644
index a8c176bcbe..0000000000 --- a/gradle/debugger/hasProtobuf.gradle
--- a/gradle/hasProtobuf.gradle +++ b/gradle/debugger/hasProtobuf.gradle
+++ /dev/null @@ -1,94 +0,0 @@
@@ -1,98 +0,0 @@
-/* ### -/* ###
- * IP: GHIDRA - * IP: GHIDRA
- * - *
- * Licensed under the Apache License, Version 2.0 (the "License"); - * Licensed under the Apache License, Version 2.0 (the "License");
- * you may not use this file except in compliance with the License. - * you may not use this file except in compliance with the License.
- * You may obtain a copy of the License at - * You may obtain a copy of the License at
- * - *
- * http://www.apache.org/licenses/LICENSE-2.0 - * http://www.apache.org/licenses/LICENSE-2.0
- * - *
- * Unless required by applicable law or agreed to in writing, software - * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS, - * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
@ -146,22 +176,7 @@ index a8c176bcbe..0000000000
- } - }
-}*/ -}*/
- -
-task configureGenerateProto { -task generateProto {
- dependsOn(configurations.protocArtifact)
-
- doLast {
- def exe = configurations.protocArtifact.first()
- if (!isCurrentWindows()) {
- exe.setExecutable(true)
- }
- generateProto.commandLine exe, "--java_out=${generateProto.outdir}", "-I${generateProto.srcdir}"
- generateProto.args generateProto.src
- }
-}
-
-// Can't use providers.exec, or else we see no output
-task generateProto(type:Exec) {
- dependsOn(configureGenerateProto)
- ext.srcdir = file("src/main/proto") - ext.srcdir = file("src/main/proto")
- ext.src = fileTree(srcdir) { - ext.src = fileTree(srcdir) {
- include "**/*.proto" - include "**/*.proto"
@ -169,6 +184,17 @@ index a8c176bcbe..0000000000
- ext.outdir = file("build/generated/source/proto/main/java") - ext.outdir = file("build/generated/source/proto/main/java")
- outputs.dir(outdir) - outputs.dir(outdir)
- inputs.files(src) - inputs.files(src)
- dependsOn(configurations.protocArtifact)
- doLast {
- def exe = configurations.protocArtifact.first()
- if (!isCurrentWindows()) {
- exe.setExecutable(true)
- }
- exec {
- commandLine exe, "--java_out=$outdir", "-I$srcdir"
- args src
- }
- }
-} -}
- -
-tasks.compileJava.dependsOn(tasks.generateProto) -tasks.compileJava.dependsOn(tasks.generateProto)
@ -183,3 +209,6 @@ index a8c176bcbe..0000000000
- } - }
-} -}
-zipSourceSubproject.dependsOn generateProto -zipSourceSubproject.dependsOn generateProto
--
2.42.0

View File

@ -0,0 +1,78 @@
{ lib
, stdenv
, unzip
, jdk
, gradle
, ghidra
}:
let
metaCommon = oldMeta:
oldMeta // (with lib; {
maintainers = (oldMeta.maintainers or []) ++ (with maintainers; [ vringar ]);
platforms = oldMeta.platforms or ghidra.meta.platforms;
});
buildGhidraExtension = {
pname, nativeBuildInputs ? [], meta ? { }, ...
}@args:
stdenv.mkDerivation (args // {
nativeBuildInputs = nativeBuildInputs ++ [
unzip
jdk
gradle
];
buildPhase = args.buildPhase or ''
runHook preBuild
# Set project name, otherwise defaults to directory name
echo -e '\nrootProject.name = "${pname}"' >> settings.gradle
export GRADLE_USER_HOME=$(mktemp -d)
gradle \
--offline \
--no-daemon \
-PGHIDRA_INSTALL_DIR=${ghidra}/lib/ghidra
runHook postBuild
'';
installPhase = args.installPhase or ''
runHook preInstall
mkdir -p $out/lib/ghidra/Ghidra/Extensions
unzip -d $out/lib/ghidra/Ghidra/Extensions dist/*.zip
runHook postInstall
'';
meta = metaCommon meta;
});
buildGhidraScripts = { pname, meta ? { }, ... }@args:
stdenv.mkDerivation (args // {
installPhase = ''
runHook preInstall
GHIDRA_HOME=$out/lib/ghidra/Ghidra/Extensions/${pname}
mkdir -p $GHIDRA_HOME
cp -r . $GHIDRA_HOME/ghidra_scripts
touch $GHIDRA_HOME/Module.manifest
cat <<'EOF' > extension.properties
name=${pname}
description=${meta.description or ""}
author=
createdOn=
version=${lib.getVersion ghidra}
EOF
runHook postInstall
'';
meta = metaCommon meta;
});
in
{ inherit buildGhidraExtension buildGhidraScripts; }

View File

@ -3,9 +3,10 @@
fetchFromGitHub, fetchFromGitHub,
lib, lib,
callPackage, callPackage,
gradle_8, gradle_7,
perl,
makeBinaryWrapper, makeBinaryWrapper,
openjdk21, openjdk17,
unzip, unzip,
makeDesktopItem, makeDesktopItem,
copyDesktopItems, copyDesktopItems,
@ -19,9 +20,7 @@
let let
pname = "ghidra"; pname = "ghidra";
version = "11.4.2"; version = "11.1.1";
isMacArm64 = stdenv.hostPlatform.isDarwin && stdenv.hostPlatform.isAarch64;
releaseName = "NIX"; releaseName = "NIX";
distroPrefix = "ghidra_${version}_${releaseName}"; distroPrefix = "ghidra_${version}_${releaseName}";
@ -29,7 +28,7 @@ let
owner = "NationalSecurityAgency"; owner = "NationalSecurityAgency";
repo = "Ghidra"; repo = "Ghidra";
rev = "Ghidra_${version}_build"; rev = "Ghidra_${version}_build";
hash = "sha256-/veSp2WuGOF0cYwUC4QFJD6kaMae5NuKrQ5Au4LjDe8="; hash = "sha256-t96FcAK3JwO66dOf4OhpOfU8CQfAczfF61Cg7m+B3fA=";
# populate values that require us to use git. By doing this in postFetch we # populate values that require us to use git. By doing this in postFetch we
# can delete .git afterwards and maintain better reproducibility of the src. # can delete .git afterwards and maintain better reproducibility of the src.
leaveDotGit = true; leaveDotGit = true;
@ -44,6 +43,8 @@ let
''; '';
}; };
gradle = gradle_7;
patches = [ patches = [
# Use our own protoc binary instead of the prebuilt one # Use our own protoc binary instead of the prebuilt one
./0001-Use-protobuf-gradle-plugin.patch ./0001-Use-protobuf-gradle-plugin.patch
@ -65,7 +66,7 @@ let
echo "application.revision.ghidra=$(cat COMMIT)" >> Ghidra/application.properties echo "application.revision.ghidra=$(cat COMMIT)" >> Ghidra/application.properties
# Tells ghidra to use our own protoc binary instead of the prebuilt one. # Tells ghidra to use our own protoc binary instead of the prebuilt one.
tee -a Ghidra/Debug/Debugger-{isf,rmi-trace}/build.gradle <<HERE cat >>Ghidra/Debug/Debugger-gadp/build.gradle <<HERE
protobuf { protobuf {
protoc { protoc {
path = '${protobuf}/bin/protoc' path = '${protobuf}/bin/protoc'
@ -74,9 +75,69 @@ let
HERE HERE
''; '';
# "Deprecated Gradle features were used in this build, making it incompatible with Gradle 9.0." # Adds a gradle step that downloads all the dependencies to the gradle cache.
gradle = gradle_8; addResolveStep = ''
cat >>build.gradle <<HERE
task resolveDependencies {
doLast {
project.rootProject.allprojects.each { subProject ->
subProject.buildscript.configurations.each { configuration ->
resolveConfiguration(subProject, configuration, "buildscript config \''${configuration.name}")
}
subProject.configurations.each { configuration ->
resolveConfiguration(subProject, configuration, "config \''${configuration.name}")
}
}
}
}
void resolveConfiguration(subProject, configuration, name) {
if (configuration.canBeResolved) {
logger.info("Resolving project {} {}", subProject.name, name)
configuration.resolve()
}
}
HERE
'';
# fake build to pre-download deps into fixed-output derivation
# Taken from mindustry derivation.
deps = stdenv.mkDerivation {
pname = "${pname}-deps";
inherit version src patches;
postPatch = addResolveStep;
nativeBuildInputs = [
gradle
perl
] ++ lib.optional stdenv.isDarwin xcbuild;
buildPhase = ''
runHook preBuild
export HOME="$NIX_BUILD_TOP/home"
mkdir -p "$HOME"
export JAVA_TOOL_OPTIONS="-Duser.home='$HOME'"
export GRADLE_USER_HOME="$HOME/.gradle"
# First, fetch the static dependencies.
gradle --no-daemon --info -Dorg.gradle.java.home=${openjdk17} -I gradle/support/fetchDependencies.gradle init
# Then, fetch the maven dependencies.
gradle --no-daemon --info -Dorg.gradle.java.home=${openjdk17} resolveDependencies
runHook postBuild
'';
# perl code mavenizes paths (com.squareup.okio/okio/1.13.0/a9283170b7305c8d92d25aff02a6ab7e45d06cbe/okio-1.13.0.jar -> com/squareup/okio/okio/1.13.0/okio-1.13.0.jar)
installPhase = ''
runHook preInstall
find $GRADLE_USER_HOME/caches/modules-2 -type f -regex '.*\.\(jar\|pom\)' \
| perl -pe 's#(.*/([^/]+)/([^/]+)/([^/]+)/[0-9a-f]{30,40}/([^/\s]+))$# ($x = $2) =~ tr|\.|/|; "install -Dm444 $1 \$out/maven/$x/$3/$4/$5" #e' \
| sh
cp -r dependencies $out/dependencies
runHook postInstall
'';
outputHashAlgo = "sha256";
outputHashMode = "recursive";
outputHash = "sha256-66gL4UFlBUo2JIEOXoF6tFvXtBdEX4b2MeSrV1b6Vg4=";
};
in in
stdenv.mkDerivation (finalAttrs: { stdenv.mkDerivation (finalAttrs: {
inherit inherit
@ -87,7 +148,7 @@ stdenv.mkDerivation (finalAttrs: {
postPatch postPatch
; ;
outputs = [ "out" "lib" "doc" ]; outputs = ["bin" "out" "lib" "doc"];
# Don't create .orig files if the patch isn't an exact match. # Don't create .orig files if the patch isn't an exact match.
patchFlags = [ patchFlags = [
@ -118,7 +179,7 @@ stdenv.mkDerivation (finalAttrs: {
python3 python3
python3Packages.pip python3Packages.pip
] ]
++ lib.optionals stdenv.hostPlatform.isDarwin [ ++ lib.optionals stdenv.isDarwin [
xcbuild xcbuild
desktopToDarwinBundle desktopToDarwinBundle
]; ];
@ -127,121 +188,54 @@ stdenv.mkDerivation (finalAttrs: {
__darwinAllowLocalNetworking = true; __darwinAllowLocalNetworking = true;
mitmCache = gradle.fetchDeps { buildPhase = ''
inherit pname; runHook preBuild
data = ./deps.json; export HOME="$NIX_BUILD_TOP/home"
}; mkdir -p "$HOME"
export JAVA_TOOL_OPTIONS="-Duser.home='$HOME'"
gradleFlags = ln -s ${deps}/dependencies dependencies
[ "-Dorg.gradle.java.home=${openjdk21}" ]
++ lib.optionals isMacArm64 [
# For some reason I haven't been able to figure out yet, ghidra builds for
# arm64 seems to build the x64 binaries of the decompiler. These fail to
# build due to trying to link the x64 object files with arm64 stdc++
# library, which obviously fails.
#
# Those binaries are entirely unnecessary anyways, since we're targeting
# arm64 build here, so let's exclude them from the build.
"-x"
"Decompiler:linkSleighMac_x86_64Executable"
"-x"
"Decompiler:linkDecompileMac_x86_64Executable"
];
preBuild = '' sed -i "s#mavenLocal()#mavenLocal(); maven { url '${deps}/maven' }#g" build.gradle
export JAVA_TOOL_OPTIONS="-Duser.home=$NIX_BUILD_TOP/home"
gradle -I gradle/support/fetchDependencies.gradle gradle --offline --no-daemon --info -Dorg.gradle.java.home=${openjdk17} buildGhidra
runHook postBuild
''; '';
gradleBuildTask = "buildGhidra";
installPhase = '' installPhase = ''
runHook preInstall runHook preInstall
mkdir -p "$lib/lib/ghidra" "$out/share/applications" "$doc/share/doc" mkdir -p "$lib/lib/ghidra" "$bin/share/applications" "$doc/share/doc"
ZIP=build/dist/$(ls build/dist) ZIP=build/dist/$(ls build/dist)
echo $ZIP echo $ZIP
unzip $ZIP -d "$lib/lib/ghidra" unzip $ZIP -d "$lib/lib/ghidra"
f=("$lib/lib/ghidra"/*) f=("$lib/lib/ghidra"/*)
mv "$lib/lib/ghidra"/*/* "$lib/lib/ghidra" mv "$lib/lib/ghidra"/*/* "$lib/lib/ghidra"
rmdir "''${f[@]}" rmdir "''${f[@]}"
mv "$lib/lib/ghidra/docs" "$doc/share/doc/ghidra" mv "$lib/lib/ghidra/docs" "$doc/share/doc/ghidra"
# the builtin help viewer needs the following to stay in-tree
mkdir "$lib/lib/ghidra/docs"
cp "$doc/share/doc/ghidra/WhatsNew.html" "$lib/lib/ghidra/docs"
cp "$doc/share/doc/ghidra/README_PDB.html" "$lib/lib/ghidra/docs"
for path in server/svrREADME.html support/GhidraGo/ghidraGoREADME.html support/analyzeHeadlessREADME.html support/buildGhidraJarREADME.txt; do for path in server/svrREADME.html support/GhidraGo/ghidraGoREADME.html support/analyzeHeadlessREADME.html support/buildGhidraJarREADME.txt; do
out_path="$(basename "$path")" out_path="$(basename "$path")"
mv "$lib/lib/ghidra/$path" "$doc/share/doc/ghidra/$out_path" mv "$lib/lib/ghidra/$path" "$doc/share/doc/ghidra/$out_path"
done done
unzip "$doc/share/doc/ghidra/GhidraAPI_javadoc.zip" -d "$doc/share/doc/ghidra"
rm "$doc/share/doc/ghidra/GhidraAPI_javadoc.zip"
for f in Ghidra/Framework/Gui/src/main/resources/images/GhidraIcon*.png; do for f in Ghidra/Framework/Gui/src/main/resources/images/GhidraIcon*.png; do
res=$(basename "$f" ".png" | cut -d"_" -f3 | cut -c11-) res=$(basename "$f" ".png" | cut -d"_" -f3 | cut -c11-)
install -Dm444 "$f" "$out/share/icons/hicolor/''${res}x''${res}/apps/ghidra.png" install -Dm444 "$f" "$bin/share/icons/hicolor/''${res}x''${res}/apps/ghidra.png"
done; done;
# improved macOS icon support # improved macOS icon support
install -Dm444 Ghidra/Framework/Gui/src/main/resources/images/GhidraIcon64.png $out/share/icons/hicolor/32x32@2/apps/ghidra.png install -Dm444 Ghidra/Framework/Gui/src/main/resources/images/GhidraIcon64.png $bin/share/icons/hicolor/32x32@2/apps/ghidra.png
runHook postInstall runHook postInstall
''; '';
postFixup = postFixup = ''
let mkdir -p "$bin/bin"
javaArgs = [ ln -s "$lib/lib/ghidra/ghidraRun" "$bin/bin/ghidra"
"-showversion" wrapProgram "$lib/lib/ghidra/support/launch.sh" \
"-cp $lib/lib/ghidra/Ghidra/Framework/Utility/lib/Utility.jar" --set-default NIX_GHIDRAHOME "$lib/lib/ghidra/Ghidra" \
--prefix PATH : ${lib.makeBinPath [ openjdk17 ]}
"-Djava.system.class.loader=ghidra.GhidraClassLoader" '';
"-Xshare:off"
"-Dfile.encoding=UTF8"
"-Dpython.console.encoding=UTF-8"
"-Duser.country=US"
"-Duser.language=en"
"-Duser.variant="
"-Dsun.java2d.opengl=false"
"-Dfont.size.override="
"-Djdk.tls.client.protocols=TLSv1.2,TLSv1.3"
"-Dcpu.core.limit="
"-Dcpu.core.override="
] ++ (lib.optionals stdenv.hostPlatform.isDarwin [
"-Xdock:name=$APPNAME"
"-Declipse.filelock.disable=true"
"-Dapple.laf.useScreenMenuBar=false"
"-Dapple.awt.application.appearance=system"
]) ++ (lib.optionals stdenv.hostPlatform.isLinux [
"-Dsun.java2d.pmoffscreen=false"
"-Dsun.java2d.xrender=true"
"-Dsun.java2d.uiScale=1"
"-Dawt.useSystemAAFontSettings=on"
]);
in ''
mkdir -p "$out/bin"
APPNAME=Ghidra
makeWrapper "${openjdk21}/bin/java" "$out/bin/ghidra" \
--set-default NIX_GHIDRAHOME "$lib/lib/ghidra/Ghidra" \
--prefix PATH : ${lib.makeBinPath [ openjdk21 ]} \
--add-flags "${lib.strings.concatStringsSep " " javaArgs}" \
--add-flags "ghidra.Ghidra ghidra.GhidraRun"
APPNAME=Ghidra-Headless
makeWrapper "${openjdk21}/bin/java" "$out/bin/ghidra-analyzeHeadless" \
--set-default NIX_GHIDRAHOME "$lib/lib/ghidra/Ghidra" \
--prefix PATH : ${lib.makeBinPath [ openjdk21 ]} \
--add-flags "${lib.strings.concatStringsSep " " javaArgs}" \
--add-flags "-Xmx2G -XX:ParallelGCThreads=2 -XX:CICompilerCount=2" \
--add-flags "ghidra.Ghidra ghidra.app.util.headless.AnalyzeHeadless"
'';
passthru = { passthru = {
inherit releaseName distroPrefix; inherit releaseName distroPrefix;
@ -273,6 +267,6 @@ stdenv.mkDerivation (finalAttrs: {
roblabla roblabla
vringar vringar
]; ];
broken = stdenv.hostPlatform.isDarwin && stdenv.hostPlatform.isx86_64; broken = stdenv.isDarwin && stdenv.isx86_64;
}; };
}) })

View File

@ -0,0 +1,83 @@
{ stdenv
, fetchzip
, lib
, makeWrapper
, autoPatchelfHook
, openjdk17
, pam
, makeDesktopItem
, icoutils
}:
let
pkg_path = "$out/lib/ghidra";
desktopItem = makeDesktopItem {
name = "ghidra";
exec = "ghidra";
icon = "ghidra";
desktopName = "Ghidra";
genericName = "Ghidra Software Reverse Engineering Suite";
categories = [ "Development" ];
terminal = false;
startupWMClass = "ghidra-Ghidra";
};
in stdenv.mkDerivation rec {
pname = "ghidra";
version = "10.4";
versiondate = "20230928";
src = fetchzip {
url = "https://github.com/NationalSecurityAgency/ghidra/releases/download/Ghidra_${version}_build/ghidra_${version}_PUBLIC_${versiondate}.zip";
hash = "sha256-IiAQ9OKmr8ZgqmGftuW0ITdG06fb9Lr30n2H9GArctk=";
};
nativeBuildInputs = [
makeWrapper
icoutils
]
++ lib.optionals stdenv.isLinux [ autoPatchelfHook ];
buildInputs = [
stdenv.cc.cc.lib
pam
];
dontStrip = true;
installPhase = ''
mkdir -p "${pkg_path}"
mkdir -p "${pkg_path}" "$out/share/applications"
cp -a * "${pkg_path}"
ln -s ${desktopItem}/share/applications/* $out/share/applications
icotool -x "${pkg_path}/support/ghidra.ico"
rm ghidra_4_40x40x32.png
for f in ghidra_*.png; do
res=$(basename "$f" ".png" | cut -d"_" -f3 | cut -d"x" -f1-2)
mkdir -pv "$out/share/icons/hicolor/$res/apps"
mv "$f" "$out/share/icons/hicolor/$res/apps/ghidra.png"
done;
'';
postFixup = ''
mkdir -p "$out/bin"
ln -s "${pkg_path}/ghidraRun" "$out/bin/ghidra"
wrapProgram "${pkg_path}/support/launch.sh" \
--prefix PATH : ${lib.makeBinPath [ openjdk17 ]}
'';
meta = with lib; {
description = "Software reverse engineering (SRE) suite of tools developed by NSA's Research Directorate in support of the Cybersecurity mission";
mainProgram = "ghidra";
homepage = "https://github.com/NationalSecurityAgency/ghidra";
platforms = [ "x86_64-linux" "x86_64-darwin" ];
sourceProvenance = with sourceTypes; [ binaryBytecode ];
license = licenses.asl20;
maintainers = with maintainers; [ ck3d govanify mic92 ];
};
}

View File

@ -0,0 +1,14 @@
{ lib, newScope, callPackage, ghidra }:
lib.makeScope newScope (self: {
inherit (callPackage ./build-extension.nix { inherit ghidra; }) buildGhidraExtension buildGhidraScripts;
ghidraninja-ghidra-scripts = self.callPackage ./extensions/ghidraninja-ghidra-scripts { };
gnudisassembler = self.callPackage ./extensions/gnudisassembler { inherit ghidra; };
machinelearning = self.callPackage ./extensions/machinelearning { inherit ghidra; };
sleighdevtools = self.callPackage ./extensions/sleighdevtools { inherit ghidra; };
})

View File

@ -1,10 +1,9 @@
{ { lib
lib, , fetchFromGitHub
fetchFromGitHub, , buildGhidraScripts
buildGhidraScripts, , binwalk
binwalk, , swift
swift, , yara
yara,
}: }:
buildGhidraScripts { buildGhidraScripts {

View File

@ -1,15 +1,14 @@
{ { lib
lib, , stdenv
stdenv, , fetchurl
fetchurl, , buildGhidraExtension
buildGhidraExtension, , ghidra
ghidra, , flex
flex, , bison
bison, , texinfo
texinfo, , perl
perl, , zlib
zlib, , xcbuild
xcbuild,
}: }:
let let
@ -25,7 +24,7 @@ buildGhidraExtension {
pname = "gnudisassembler"; pname = "gnudisassembler";
version = lib.getVersion ghidra; version = lib.getVersion ghidra;
src = "${ghidra.lib}/lib/ghidra/Extensions/Ghidra/${ghidra.distroPrefix}_GnuDisassembler.zip"; src = "${ghidra}/lib/ghidra/Extensions/Ghidra/${ghidra.distroPrefix}_GnuDisassembler.zip";
postPatch = '' postPatch = ''
ln -s ${binutils-src} binutils-${binutils-version}.tar.bz2 ln -s ${binutils-src} binutils-${binutils-version}.tar.bz2
@ -42,11 +41,13 @@ buildGhidraExtension {
bison bison
texinfo texinfo
perl perl
] ] ++ lib.optionals stdenv.hostPlatform.isDarwin [
++ lib.optionals stdenv.hostPlatform.isDarwin [ xcbuild ]; xcbuild
];
buildInputs = [ zlib ]; buildInputs = [
gradleBuildTask = "assemble"; zlib
];
installPhase = '' installPhase = ''
runHook preInstall runHook preInstall

View File

@ -1,14 +1,13 @@
{ { lib
lib, , buildGhidraExtension
buildGhidraExtension, , ghidra
ghidra,
}: }:
buildGhidraExtension { buildGhidraExtension {
pname = "machinelearning"; pname = "machinelearning";
version = lib.getVersion ghidra; version = lib.getVersion ghidra;
src = "${ghidra.lib}/lib/ghidra/Extensions/Ghidra/${ghidra.distroPrefix}_MachineLearning.zip"; src = "${ghidra}/lib/ghidra/Extensions/Ghidra/${ghidra.distroPrefix}_MachineLearning.zip";
dontUnpack = true; dontUnpack = true;
# Built as part ghidra # Built as part ghidra
@ -29,7 +28,7 @@ buildGhidraExtension {
downloadPage = "https://github.com/NationalSecurityAgency/ghidra/tree/master/Ghidra/Extensions/MachineLearning"; downloadPage = "https://github.com/NationalSecurityAgency/ghidra/tree/master/Ghidra/Extensions/MachineLearning";
sourceProvenance = with sourceTypes; [ sourceProvenance = with sourceTypes; [
fromSource fromSource
binaryBytecode # deps binaryBytecode # deps
]; ];
}; };
} }

View File

@ -1,15 +1,14 @@
{ { lib
lib, , buildGhidraExtension
buildGhidraExtension, , ghidra
ghidra, , python3
python3,
}: }:
buildGhidraExtension { buildGhidraExtension {
pname = "sleighdevtools"; pname = "sleighdevtools";
version = lib.getVersion ghidra; version = lib.getVersion ghidra;
src = "${ghidra.lib}/lib/ghidra/Extensions/Ghidra/${ghidra.distroPrefix}_SleighDevTools.zip"; src = "${ghidra}/lib/ghidra/Extensions/Ghidra/${ghidra.distroPrefix}_SleighDevTools.zip";
dontUnpack = true; dontUnpack = true;
# Built as part ghidra # Built as part ghidra
@ -35,7 +34,7 @@ buildGhidraExtension {
downloadPage = "https://github.com/NationalSecurityAgency/ghidra/tree/master/Ghidra/Extensions/SleighDevTools"; downloadPage = "https://github.com/NationalSecurityAgency/ghidra/tree/master/Ghidra/Extensions/SleighDevTools";
sourceProvenance = with sourceTypes; [ sourceProvenance = with sourceTypes; [
fromSource fromSource
binaryBytecode # deps binaryBytecode # deps
]; ];
}; };
} }

View File

@ -0,0 +1,36 @@
{ lib
, stdenv
, callPackage
, symlinkJoin
, makeBinaryWrapper
, desktopToDarwinBundle
, ghidra
}:
let
ghidra-extensions = callPackage ./extensions.nix { inherit ghidra; };
allExtensions = lib.filterAttrs (n: pkg: lib.isDerivation pkg) ghidra-extensions;
/* Make Ghidra with additional extensions
Example:
pkgs.ghidra.withExtensions (p: with p; [
ghostrings
]);
=> /nix/store/3yn0rbnz5mbrxf0x70jbjq73wgkszr5c-ghidra-with-extensions-10.2.2
*/
withExtensions = f: (symlinkJoin {
name = "${ghidra.pname}-with-extensions-${lib.getVersion ghidra}";
paths = (f allExtensions);
nativeBuildInputs = [ makeBinaryWrapper ]
++ lib.optional stdenv.hostPlatform.isDarwin desktopToDarwinBundle;
postBuild = ''
makeWrapper '${ghidra}/bin/ghidra' "$out/bin/ghidra" \
--set NIX_GHIDRAHOME "$out/lib/ghidra/Ghidra"
ln -s ${ghidra}/share $out/share
'' + lib.optionalString stdenv.hostPlatform.isDarwin ''
convertDesktopFiles $prefix
'';
inherit (ghidra) meta;
});
in
withExtensions

View File

@ -0,0 +1,214 @@
From ffb6777d58f068db7e14372415154cd93f77766e Mon Sep 17 00:00:00 2001
From: roblabla <unfiltered@roblab.la>
Date: Wed, 31 Jan 2024 13:19:55 +0100
Subject: [PATCH] Use com.google.protobuf:protobuf-gradle-plugin
---
Ghidra/Debug/Debugger-gadp/build.gradle | 7 +-
Ghidra/Debug/Debugger-isf/build.gradle | 8 +-
Ghidra/Debug/Debugger-rmi-trace/build.gradle | 14 +--
build.gradle | 6 ++
gradle/debugger/hasProtobuf.gradle | 94 --------------------
5 files changed, 26 insertions(+), 103 deletions(-)
diff --git a/Ghidra/Debug/Debugger-gadp/build.gradle b/Ghidra/Debug/Debugger-gadp/build.gradle
index 9e1c57faf..3a3242eb5 100644
--- a/Ghidra/Debug/Debugger-gadp/build.gradle
+++ b/Ghidra/Debug/Debugger-gadp/build.gradle
@@ -18,11 +18,16 @@ apply from: "${rootProject.projectDir}/gradle/javaProject.gradle"
apply from: "${rootProject.projectDir}/gradle/jacocoProject.gradle"
apply from: "${rootProject.projectDir}/gradle/javaTestProject.gradle"
apply from: "${rootProject.projectDir}/gradle/distributableGhidraModule.gradle"
-apply from: "${rootProject.projectDir}/gradle/debugger/hasProtobuf.gradle"
+apply plugin: 'com.google.protobuf'
apply plugin: 'eclipse'
eclipse.project.name = 'Debug Debugger-gadp'
+buildscript {
+ dependencies {
+ classpath 'com.google.protobuf:protobuf-gradle-plugin:0.8.18'
+ }
+}
dependencies {
api project(':Framework-AsyncComm')
api project(':Framework-Debugging')
diff --git a/Ghidra/Debug/Debugger-isf/build.gradle b/Ghidra/Debug/Debugger-isf/build.gradle
index d135294a0..785681ca2 100644
--- a/Ghidra/Debug/Debugger-isf/build.gradle
+++ b/Ghidra/Debug/Debugger-isf/build.gradle
@@ -18,11 +18,15 @@ apply from: "${rootProject.projectDir}/gradle/javaProject.gradle"
apply from: "${rootProject.projectDir}/gradle/jacocoProject.gradle"
apply from: "${rootProject.projectDir}/gradle/javaTestProject.gradle"
apply from: "${rootProject.projectDir}/gradle/distributableGhidraModule.gradle"
-apply from: "${rootProject.projectDir}/gradle/debugger/hasProtobuf.gradle"
-
+apply plugin: 'com.google.protobuf'
apply plugin: 'eclipse'
eclipse.project.name = 'Debug Debugger-isf'
+buildscript {
+ dependencies {
+ classpath 'com.google.protobuf:protobuf-gradle-plugin:0.8.18'
+ }
+}
dependencies {
api project(':Framework-AsyncComm')
api project(':Framework-Debugging')
diff --git a/Ghidra/Debug/Debugger-rmi-trace/build.gradle b/Ghidra/Debug/Debugger-rmi-trace/build.gradle
index 40fbc17ab..7517ffe6e 100644
--- a/Ghidra/Debug/Debugger-rmi-trace/build.gradle
+++ b/Ghidra/Debug/Debugger-rmi-trace/build.gradle
@@ -18,12 +18,17 @@ apply from: "${rootProject.projectDir}/gradle/javaProject.gradle"
apply from: "${rootProject.projectDir}/gradle/jacocoProject.gradle"
apply from: "${rootProject.projectDir}/gradle/javaTestProject.gradle"
apply from: "${rootProject.projectDir}/gradle/distributableGhidraModule.gradle"
-apply from: "${rootProject.projectDir}/gradle/debugger/hasProtobuf.gradle"
+apply plugin: 'com.google.protobuf'
apply from: "${rootProject.projectDir}/gradle/debugger/hasPythonPackage.gradle"
apply plugin: 'eclipse'
eclipse.project.name = 'Debug Debugger-rmi-trace'
+buildscript {
+ dependencies {
+ classpath 'com.google.protobuf:protobuf-gradle-plugin:0.8.18'
+ }
+}
dependencies {
api project(':Pty')
api project(':Debugger')
@@ -44,12 +49,9 @@ task generateProtoPy {
ext.outdir = file("build/generated/source/proto/main/py")
outputs.dir(outdir)
inputs.files(src)
- dependsOn(configurations.protocArtifact)
+ dependsOn(protobuf.generateProtoTasks.all())
doLast {
- def exe = configurations.protocArtifact.first()
- if (!isCurrentWindows()) {
- exe.setExecutable(true)
- }
+ def exe = protobuf.tools.protoc.path
exec {
commandLine exe, "--python_out=$outdir", "-I$srcdir"
args src
diff --git a/build.gradle b/build.gradle
index b0c717fb1..5f56506a5 100644
--- a/build.gradle
+++ b/build.gradle
@@ -74,6 +74,12 @@ if (flatRepo.isDirectory()) {
jcenter()
flatDir name: "flat", dirs:["$flatRepo"]
}
+ buildscript {
+ repositories {
+ mavenLocal()
+ mavenCentral()
+ }
+ }
}
}
else {
diff --git a/gradle/debugger/hasProtobuf.gradle b/gradle/debugger/hasProtobuf.gradle
index 23b4ce74b..e69de29bb 100644
--- a/gradle/debugger/hasProtobuf.gradle
+++ b/gradle/debugger/hasProtobuf.gradle
@@ -1,94 +0,0 @@
-/* ###
- * IP: GHIDRA
- *
- * Licensed under the Apache License, Version 2.0 (the "License");
- * you may not use this file except in compliance with the License.
- * You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-/*plugins {
- id 'com.google.protobuf' version '0.8.10'
-}*/
-
-configurations {
- allProtocArtifacts
- protocArtifact
-}
-
-def platform = getCurrentPlatformName()
-
-
-dependencies {
- allProtocArtifacts 'com.google.protobuf:protoc:3.21.8:windows-x86_64@exe'
- allProtocArtifacts 'com.google.protobuf:protoc:3.21.8:linux-x86_64@exe'
- allProtocArtifacts 'com.google.protobuf:protoc:3.21.8:linux-aarch_64@exe'
- allProtocArtifacts 'com.google.protobuf:protoc:3.21.8:osx-x86_64@exe'
- allProtocArtifacts 'com.google.protobuf:protoc:3.21.8:osx-aarch_64@exe'
-
- if (isCurrentWindows()) {
- protocArtifact 'com.google.protobuf:protoc:3.21.8:windows-x86_64@exe'
- }
- if (isCurrentLinux()) {
- if (platform.endsWith("x86_64")) {
- protocArtifact 'com.google.protobuf:protoc:3.21.8:linux-x86_64@exe'
- }
- else {
- protocArtifact 'com.google.protobuf:protoc:3.21.8:linux-aarch_64@exe'
- }
- }
- if (isCurrentMac()) {
- if (platform.endsWith("x86_64")) {
- protocArtifact 'com.google.protobuf:protoc:3.21.8:osx-x86_64@exe'
- }
- else {
- protocArtifact 'com.google.protobuf:protoc:3.21.8:osx-aarch_64@exe'
- }
- }
-}
-
-/*protobuf {
- protoc {
- artifact = 'com.google.protobuf:protoc:3.21.8'
- }
-}*/
-
-task generateProto {
- ext.srcdir = file("src/main/proto")
- ext.src = fileTree(srcdir) {
- include "**/*.proto"
- }
- ext.outdir = file("build/generated/source/proto/main/java")
- outputs.dir(outdir)
- inputs.files(src)
- dependsOn(configurations.protocArtifact)
- doLast {
- def exe = configurations.protocArtifact.first()
- if (!isCurrentWindows()) {
- exe.setExecutable(true)
- }
- exec {
- commandLine exe, "--java_out=$outdir", "-I$srcdir"
- args src
- }
- }
-}
-
-tasks.compileJava.dependsOn(tasks.generateProto)
-tasks.eclipse.dependsOn(tasks.generateProto)
-rootProject.tasks.prepDev.dependsOn(tasks.generateProto)
-
-sourceSets {
- main {
- java {
- srcDir tasks.generateProto.outdir
- }
- }
-}
-zipSourceSubproject.dependsOn generateProto
--
2.42.0

View File

@ -0,0 +1,15 @@
diff --git a/Ghidra/Framework/Utility/src/main/java/utility/application/ApplicationUtilities.java b/Ghidra/Framework/Utility/src/main/java/utility/application/ApplicationUtilities.java
index ea12a661f0..da7779b07f 100644
--- a/Ghidra/Framework/Utility/src/main/java/utility/application/ApplicationUtilities.java
+++ b/Ghidra/Framework/Utility/src/main/java/utility/application/ApplicationUtilities.java
@@ -36,6 +36,10 @@ public class ApplicationUtilities {
*/
public static Collection<ResourceFile> findDefaultApplicationRootDirs() {
Collection<ResourceFile> applicationRootDirs = new ArrayList<>();
+ String nixGhidraHome = System.getenv("NIX_GHIDRAHOME");
+ if (nixGhidraHome != null) {
+ applicationRootDirs.add(new ResourceFile(nixGhidraHome));
+ };
ResourceFile applicationRootDir = findPrimaryApplicationRootDir();
if (applicationRootDir != null) {
applicationRootDirs.add(applicationRootDir);

View File

@ -0,0 +1,26 @@
diff --git a/Ghidra/RuntimeScripts/Common/support/buildExtension.gradle b/Ghidra/RuntimeScripts/Common/support/buildExtension.gradle
index bc194f219..94b00fabd 100644
--- a/Ghidra/RuntimeScripts/Common/support/buildExtension.gradle
+++ b/Ghidra/RuntimeScripts/Common/support/buildExtension.gradle
@@ -82,7 +82,7 @@ dependencies {
helpPath fileTree(dir: ghidraDir + '/Features/Base', include: "**/Base.jar")
}
-def ZIP_NAME_PREFIX = "${DISTRO_PREFIX}_${RELEASE_NAME}_${getCurrentDate()}"
+def ZIP_NAME_PREFIX = "${DISTRO_PREFIX}_${RELEASE_NAME}"
def DISTRIBUTION_DIR = file("dist")
def pathInZip = "${project.name}"
diff --git a/gradle/root/distribution.gradle b/gradle/root/distribution.gradle
index f44c8267b..f6231c417 100644
--- a/gradle/root/distribution.gradle
+++ b/gradle/root/distribution.gradle
@@ -32,7 +32,7 @@ apply from: "$rootProject.projectDir/gradle/support/sbom.gradle"
def currentPlatform = getCurrentPlatformName()
def PROJECT_DIR = file (rootProject.projectDir.absolutePath)
ext.DISTRIBUTION_DIR = file("$buildDir/dist")
-ext.ZIP_NAME_PREFIX = "${rootProject.DISTRO_PREFIX}_${rootProject.BUILD_DATE_SHORT}"
+ext.ZIP_NAME_PREFIX = "${rootProject.DISTRO_PREFIX}"
ext.ZIP_DIR_PREFIX = "${rootProject.DISTRO_PREFIX}"
ext.ALL_REPOS = [rootProject.file('.').getName()]

View File

@ -0,0 +1,78 @@
{ lib
, stdenv
, unzip
, jdk
, gradle
, ghidra
}:
let
metaCommon = oldMeta:
oldMeta // (with lib; {
maintainers = (oldMeta.maintainers or []) ++ (with maintainers; [ vringar ]);
platforms = oldMeta.platforms or ghidra.meta.platforms;
});
buildGhidraExtension = {
pname, nativeBuildInputs ? [], meta ? { }, ...
}@args:
stdenv.mkDerivation (args // {
nativeBuildInputs = nativeBuildInputs ++ [
unzip
jdk
gradle
];
buildPhase = args.buildPhase or ''
runHook preBuild
# Set project name, otherwise defaults to directory name
echo -e '\nrootProject.name = "${pname}"' >> settings.gradle
export GRADLE_USER_HOME=$(mktemp -d)
gradle \
--offline \
--no-daemon \
-PGHIDRA_INSTALL_DIR=${ghidra}/lib/ghidra
runHook postBuild
'';
installPhase = args.installPhase or ''
runHook preInstall
mkdir -p $out/lib/ghidra/Ghidra/Extensions
unzip -d $out/lib/ghidra/Ghidra/Extensions dist/*.zip
runHook postInstall
'';
meta = metaCommon meta;
});
buildGhidraScripts = { pname, meta ? { }, ... }@args:
stdenv.mkDerivation (args // {
installPhase = ''
runHook preInstall
GHIDRA_HOME=$out/lib/ghidra/Ghidra/Extensions/${pname}
mkdir -p $GHIDRA_HOME
cp -r . $GHIDRA_HOME/ghidra_scripts
touch $GHIDRA_HOME/Module.manifest
cat <<'EOF' > extension.properties
name=${pname}
description=${meta.description or ""}
author=
createdOn=
version=${lib.getVersion ghidra}
EOF
runHook postInstall
'';
meta = metaCommon meta;
});
in
{ inherit buildGhidraExtension buildGhidraScripts; }
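
For orientation, a hypothetical out-of-tree extension could be packaged with the `buildGhidraExtension` helper above roughly as in the sketch below; the pname, owner, repo, rev and hash are placeholders, not a real package:

```nix
# minimal sketch -- every name and the hash below are placeholders
{ lib, fetchFromGitHub, ghidra }:

ghidra.buildGhidraExtension {
  pname = "example-extension";           # hypothetical
  version = "0.1.0";
  src = fetchFromGitHub {
    owner = "example";                   # placeholder owner/repo
    repo = "example-ghidra-extension";
    rev = "v0.1.0";
    hash = lib.fakeHash;                 # fill in after the first build attempt
  };
}
```

`buildGhidraScripts` takes the same shape for plain `ghidra_scripts` collections, as the ghidraninja package further down shows.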

266
pkgs/ghidra-xenia/build.nix Normal file
View File

@ -0,0 +1,266 @@
{
stdenv,
fetchFromGitHub,
lib,
callPackage,
gradle_7,
perl,
makeBinaryWrapper,
openjdk17,
unzip,
makeDesktopItem,
copyDesktopItems,
desktopToDarwinBundle,
xcbuild,
protobuf,
ghidra-extensions,
python3,
python3Packages,
}:
let
pkg_path = "$out/lib/ghidra";
pname = "ghidra";
version = "11.1.1";
releaseName = "NIX";
distroPrefix = "ghidra_${version}_${releaseName}";
src = fetchFromGitHub {
owner = "NationalSecurityAgency";
repo = "Ghidra";
rev = "Ghidra_${version}_build";
hash = "sha256-t96FcAK3JwO66dOf4OhpOfU8CQfAczfF61Cg7m+B3fA=";
# populate values that require us to use git. By doing this in postFetch we
# can delete .git afterwards and maintain better reproducibility of the src.
leaveDotGit = true;
postFetch = ''
cd "$out"
git rev-parse HEAD > $out/COMMIT
# 1970-Jan-01
date -u -d "@$(git log -1 --pretty=%ct)" "+%Y-%b-%d" > $out/SOURCE_DATE_EPOCH
# 19700101
date -u -d "@$(git log -1 --pretty=%ct)" "+%Y%m%d" > $out/SOURCE_DATE_EPOCH_SHORT
find "$out" -name .git -print0 | xargs -0 rm -rf
'';
};
gradle = gradle_7;
patches = [
# Use our own protoc binary instead of the prebuilt one
./0001-Use-protobuf-gradle-plugin.patch
# Override installation directory to allow loading extensions
./0002-Load-nix-extensions.patch
# Remove build dates from output filenames for easier reference
./0003-Remove-build-datestamp.patch
];
postPatch = ''
    # Set name of release (e.g. PUBLIC, DEV, etc.)
sed -i -e 's/application\.release\.name=.*/application.release.name=${releaseName}/' Ghidra/application.properties
# Set build date and git revision
echo "application.build.date=$(cat SOURCE_DATE_EPOCH)" >> Ghidra/application.properties
echo "application.build.date.short=$(cat SOURCE_DATE_EPOCH_SHORT)" >> Ghidra/application.properties
echo "application.revision.ghidra=$(cat COMMIT)" >> Ghidra/application.properties
# Tells ghidra to use our own protoc binary instead of the prebuilt one.
cat >>Ghidra/Debug/Debugger-gadp/build.gradle <<HERE
protobuf {
protoc {
path = '${protobuf}/bin/protoc'
}
}
HERE
'';
# Adds a gradle step that downloads all the dependencies to the gradle cache.
addResolveStep = ''
cat >>build.gradle <<HERE
task resolveDependencies {
doLast {
project.rootProject.allprojects.each { subProject ->
subProject.buildscript.configurations.each { configuration ->
resolveConfiguration(subProject, configuration, "buildscript config \''${configuration.name}")
}
subProject.configurations.each { configuration ->
resolveConfiguration(subProject, configuration, "config \''${configuration.name}")
}
}
}
}
void resolveConfiguration(subProject, configuration, name) {
if (configuration.canBeResolved) {
logger.info("Resolving project {} {}", subProject.name, name)
configuration.resolve()
}
}
HERE
'';
# fake build to pre-download deps into fixed-output derivation
# Taken from mindustry derivation.
deps = stdenv.mkDerivation {
pname = "${pname}-deps";
inherit version src patches;
postPatch = addResolveStep;
nativeBuildInputs = [
gradle
perl
] ++ lib.optional stdenv.isDarwin xcbuild;
buildPhase = ''
runHook preBuild
export HOME="$NIX_BUILD_TOP/home"
mkdir -p "$HOME"
export JAVA_TOOL_OPTIONS="-Duser.home='$HOME'"
export GRADLE_USER_HOME="$HOME/.gradle"
# First, fetch the static dependencies.
gradle --no-daemon --info -Dorg.gradle.java.home=${openjdk17} -I gradle/support/fetchDependencies.gradle init
# Then, fetch the maven dependencies.
gradle --no-daemon --info -Dorg.gradle.java.home=${openjdk17} resolveDependencies
runHook postBuild
'';
  # perl code mavenizes paths (com.squareup.okio/okio/1.13.0/a9283170b7305c8d92d25aff02a6ab7e45d06cbe/okio-1.13.0.jar -> com/squareup/okio/okio/1.13.0/okio-1.13.0.jar)
installPhase = ''
runHook preInstall
find $GRADLE_USER_HOME/caches/modules-2 -type f -regex '.*\.\(jar\|pom\)' \
| perl -pe 's#(.*/([^/]+)/([^/]+)/([^/]+)/[0-9a-f]{30,40}/([^/\s]+))$# ($x = $2) =~ tr|\.|/|; "install -Dm444 $1 \$out/maven/$x/$3/$4/$5" #e' \
| sh
cp -r dependencies $out/dependencies
runHook postInstall
'';
outputHashAlgo = "sha256";
outputHashMode = "recursive";
outputHash = "sha256-66gL4UFlBUo2JIEOXoF6tFvXtBdEX4b2MeSrV1b6Vg4=";
};
in
stdenv.mkDerivation (finalAttrs: {
inherit
pname
version
src
patches
postPatch
;
# Don't create .orig files if the patch isn't an exact match.
patchFlags = [
"--no-backup-if-mismatch"
"-p1"
];
desktopItems = [
(makeDesktopItem {
name = "ghidra";
exec = "ghidra";
icon = "ghidra";
desktopName = "Ghidra";
genericName = "Ghidra Software Reverse Engineering Suite";
categories = [ "Development" ];
terminal = false;
startupWMClass = "ghidra-Ghidra";
})
];
nativeBuildInputs =
[
gradle
unzip
makeBinaryWrapper
copyDesktopItems
protobuf
python3
python3Packages.pip
]
++ lib.optionals stdenv.isDarwin [
xcbuild
desktopToDarwinBundle
];
dontStrip = true;
__darwinAllowLocalNetworking = true;
buildPhase = ''
runHook preBuild
export HOME="$NIX_BUILD_TOP/home"
mkdir -p "$HOME"
export JAVA_TOOL_OPTIONS="-Duser.home='$HOME'"
ln -s ${deps}/dependencies dependencies
sed -i "s#mavenLocal()#mavenLocal(); maven { url '${deps}/maven' }#g" build.gradle
gradle --offline --no-daemon --info -Dorg.gradle.java.home=${openjdk17} buildGhidra
runHook postBuild
'';
installPhase = ''
runHook preInstall
mkdir -p "${pkg_path}" "$out/share/applications"
ZIP=build/dist/$(ls build/dist)
echo $ZIP
unzip $ZIP -d ${pkg_path}
f=("${pkg_path}"/*)
mv "${pkg_path}"/*/* "${pkg_path}"
rmdir "''${f[@]}"
for f in Ghidra/Framework/Gui/src/main/resources/images/GhidraIcon*.png; do
res=$(basename "$f" ".png" | cut -d"_" -f3 | cut -c11-)
install -Dm444 "$f" "$out/share/icons/hicolor/''${res}x''${res}/apps/ghidra.png"
done;
# improved macOS icon support
install -Dm444 Ghidra/Framework/Gui/src/main/resources/images/GhidraIcon64.png $out/share/icons/hicolor/32x32@2/apps/ghidra.png
runHook postInstall
'';
postFixup = ''
mkdir -p "$out/bin"
ln -s "${pkg_path}/ghidraRun" "$out/bin/ghidra"
wrapProgram "${pkg_path}/support/launch.sh" \
--set-default NIX_GHIDRAHOME "${pkg_path}/Ghidra" \
--prefix PATH : ${lib.makeBinPath [ openjdk17 ]}
'';
passthru = {
inherit releaseName distroPrefix;
inherit (ghidra-extensions.override { ghidra = finalAttrs.finalPackage; })
buildGhidraExtension
buildGhidraScripts
;
withExtensions = callPackage ./with-extensions.nix { ghidra = finalAttrs.finalPackage; };
};
meta = with lib; {
changelog = "https://htmlpreview.github.io/?https://github.com/NationalSecurityAgency/ghidra/blob/Ghidra_${finalAttrs.version}_build/Ghidra/Configurations/Public_Release/src/global/docs/ChangeHistory.html";
description = "Software reverse engineering (SRE) suite of tools";
mainProgram = "ghidra";
homepage = "https://ghidra-sre.org/";
platforms = [
"x86_64-linux"
"aarch64-linux"
"x86_64-darwin"
"aarch64-darwin"
];
sourceProvenance = with sourceTypes; [
fromSource
binaryBytecode # deps
];
license = licenses.asl20;
maintainers = with maintainers; [
roblabla
vringar
];
broken = stdenv.isDarwin && stdenv.isx86_64;
};
})

View File

@ -0,0 +1,83 @@
{ stdenv
, fetchzip
, lib
, makeWrapper
, autoPatchelfHook
, openjdk17
, pam
, makeDesktopItem
, icoutils
}:
let
pkg_path = "$out/lib/ghidra";
desktopItem = makeDesktopItem {
name = "ghidra";
exec = "ghidra";
icon = "ghidra";
desktopName = "Ghidra";
genericName = "Ghidra Software Reverse Engineering Suite";
categories = [ "Development" ];
terminal = false;
startupWMClass = "ghidra-Ghidra";
};
in stdenv.mkDerivation rec {
pname = "ghidra";
version = "10.4";
versiondate = "20230928";
src = fetchzip {
url = "https://github.com/NationalSecurityAgency/ghidra/releases/download/Ghidra_${version}_build/ghidra_${version}_PUBLIC_${versiondate}.zip";
hash = "sha256-IiAQ9OKmr8ZgqmGftuW0ITdG06fb9Lr30n2H9GArctk=";
};
nativeBuildInputs = [
makeWrapper
icoutils
]
++ lib.optionals stdenv.isLinux [ autoPatchelfHook ];
buildInputs = [
stdenv.cc.cc.lib
pam
];
dontStrip = true;
installPhase = ''
mkdir -p "${pkg_path}"
mkdir -p "${pkg_path}" "$out/share/applications"
cp -a * "${pkg_path}"
ln -s ${desktopItem}/share/applications/* $out/share/applications
icotool -x "${pkg_path}/support/ghidra.ico"
rm ghidra_4_40x40x32.png
for f in ghidra_*.png; do
res=$(basename "$f" ".png" | cut -d"_" -f3 | cut -d"x" -f1-2)
mkdir -pv "$out/share/icons/hicolor/$res/apps"
mv "$f" "$out/share/icons/hicolor/$res/apps/ghidra.png"
done;
'';
postFixup = ''
mkdir -p "$out/bin"
ln -s "${pkg_path}/ghidraRun" "$out/bin/ghidra"
wrapProgram "${pkg_path}/support/launch.sh" \
--prefix PATH : ${lib.makeBinPath [ openjdk17 ]}
'';
meta = with lib; {
description = "Software reverse engineering (SRE) suite of tools developed by NSA's Research Directorate in support of the Cybersecurity mission";
mainProgram = "ghidra";
homepage = "https://github.com/NationalSecurityAgency/ghidra";
platforms = [ "x86_64-linux" "x86_64-darwin" ];
sourceProvenance = with sourceTypes; [ binaryBytecode ];
license = licenses.asl20;
maintainers = with maintainers; [ ck3d govanify mic92 ];
};
}

View File

@ -0,0 +1,14 @@
{ lib, newScope, callPackage, ghidra }:
lib.makeScope newScope (self: {
inherit (callPackage ./build-extension.nix { inherit ghidra; }) buildGhidraExtension buildGhidraScripts;
ghidraninja-ghidra-scripts = self.callPackage ./extensions/ghidraninja-ghidra-scripts { };
gnudisassembler = self.callPackage ./extensions/gnudisassembler { inherit ghidra; };
machinelearning = self.callPackage ./extensions/machinelearning { inherit ghidra; };
sleighdevtools = self.callPackage ./extensions/sleighdevtools { inherit ghidra; };
})

View File

@ -0,0 +1,36 @@
{ lib
, fetchFromGitHub
, buildGhidraScripts
, binwalk
, swift
, yara
}:
buildGhidraScripts {
pname = "ghidraninja-ghidra-scripts";
version = "unstable-2020-10-07";
src = fetchFromGitHub {
owner = "ghidraninja";
repo = "ghidra_scripts";
rev = "99f2a8644a29479618f51e2d4e28f10ba5e9ac48";
sha256 = "aElx0mp66/OHQRfXwTkqdLL0gT2T/yL00bOobYleME8=";
};
postPatch = ''
# Replace subprocesses with store versions
substituteInPlace binwalk.py --replace-fail 'subprocess.call(["binwalk"' 'subprocess.call(["${binwalk}/bin/binwalk"'
substituteInPlace swift_demangler.py --replace-fail '"swift"' '"${swift}/bin/swift"'
substituteInPlace yara.py --replace-fail 'subprocess.check_output(["yara"' 'subprocess.check_output(["${yara}/bin/yara"'
substituteInPlace YaraSearch.py --replace-fail '"yara "' '"${yara}/bin/yara "'
'';
meta = with lib; {
description = "Scripts for the Ghidra software reverse engineering suite";
homepage = "https://github.com/ghidraninja/ghidra_scripts";
license = with licenses; [
gpl3Only
gpl2Only
];
};
}

View File

@ -0,0 +1,71 @@
{ lib
, stdenv
, fetchurl
, buildGhidraExtension
, ghidra
, flex
, bison
, texinfo
, perl
, zlib
, xcbuild
}:
let
# Incorporates source from binutils
# https://github.com/NationalSecurityAgency/ghidra/blob/7ab9bf6abffb6938d61d072040fc34ad3331332b/GPL/GnuDisassembler/build.gradle#L34-L35
binutils-version = "2.41";
binutils-src = fetchurl {
url = "mirror://gnu/binutils/binutils-${binutils-version}.tar.bz2";
sha256 = "sha256-pMS+wFL3uDcAJOYDieGUN38/SLVmGEGOpRBn9nqqsws=";
};
in
buildGhidraExtension {
pname = "gnudisassembler";
version = lib.getVersion ghidra;
src = "${ghidra}/lib/ghidra/Extensions/Ghidra/${ghidra.distroPrefix}_GnuDisassembler.zip";
postPatch = ''
ln -s ${binutils-src} binutils-${binutils-version}.tar.bz2
'';
# Don't modify ELF stub resources
dontPatchELF = true;
dontStrip = true;
__darwinAllowLocalNetworking = true;
nativeBuildInputs = [
flex
bison
texinfo
perl
] ++ lib.optionals stdenv.hostPlatform.isDarwin [
xcbuild
];
buildInputs = [
zlib
];
installPhase = ''
runHook preInstall
EXTENSIONS_ROOT=$out/lib/ghidra/Ghidra/Extensions
mkdir -p $EXTENSIONS_ROOT
unzip -d $EXTENSIONS_ROOT $src
mkdir -p $EXTENSIONS_ROOT/GnuDisassembler/build
cp -r build/os $EXTENSIONS_ROOT/GnuDisassembler/build/
runHook postInstall
'';
meta = with lib; {
description = "Leverage the binutils disassembler capabilities for various processors";
homepage = "https://ghidra-sre.org/";
downloadPage = "https://github.com/NationalSecurityAgency/ghidra/tree/master/GPL/GnuDisassembler";
license = licenses.gpl2Only;
};
}

View File

@ -0,0 +1,34 @@
{ lib
, buildGhidraExtension
, ghidra
}:
buildGhidraExtension {
pname = "machinelearning";
version = lib.getVersion ghidra;
src = "${ghidra}/lib/ghidra/Extensions/Ghidra/${ghidra.distroPrefix}_MachineLearning.zip";
dontUnpack = true;
  # Built as part of ghidra
dontBuild = true;
installPhase = ''
runHook preInstall
mkdir -p $out/lib/ghidra/Ghidra/Extensions
unzip -d $out/lib/ghidra/Ghidra/Extensions $src
runHook postInstall
'';
meta = with lib; {
inherit (ghidra.meta) homepage license;
description = "Finds functions using ML";
downloadPage = "https://github.com/NationalSecurityAgency/ghidra/tree/master/Ghidra/Extensions/MachineLearning";
sourceProvenance = with sourceTypes; [
fromSource
binaryBytecode # deps
];
};
}

View File

@ -0,0 +1,40 @@
{ lib
, buildGhidraExtension
, ghidra
, python3
}:
buildGhidraExtension {
pname = "sleighdevtools";
version = lib.getVersion ghidra;
src = "${ghidra}/lib/ghidra/Extensions/Ghidra/${ghidra.distroPrefix}_SleighDevTools.zip";
dontUnpack = true;
  # Built as part of ghidra
dontBuild = true;
buildInputs = [ python3 ];
installPhase = ''
runHook preInstall
mkdir -p $out/lib/ghidra/Ghidra/Extensions
unzip -d $out/lib/ghidra/Ghidra/Extensions $src
runHook postInstall
'';
meta = with lib; {
inherit (ghidra.meta) homepage license;
description = "Sleigh language development tools including external disassembler capabilities";
longDescription = ''
Sleigh language development tools including external disassembler capabilities.
The GnuDisassembler extension may be also be required as a disassembly provider.
'';
downloadPage = "https://github.com/NationalSecurityAgency/ghidra/tree/master/Ghidra/Extensions/SleighDevTools";
sourceProvenance = with sourceTypes; [
fromSource
binaryBytecode # deps
];
};
}

View File

@ -0,0 +1,36 @@
{ lib
, stdenv
, callPackage
, symlinkJoin
, makeBinaryWrapper
, desktopToDarwinBundle
, ghidra
}:
let
ghidra-extensions = callPackage ./extensions.nix { inherit ghidra; };
allExtensions = lib.filterAttrs (n: pkg: lib.isDerivation pkg) ghidra-extensions;
/* Make Ghidra with additional extensions
Example:
pkgs.ghidra.withExtensions (p: with p; [
ghostrings
]);
=> /nix/store/3yn0rbnz5mbrxf0x70jbjq73wgkszr5c-ghidra-with-extensions-10.2.2
*/
withExtensions = f: (symlinkJoin {
name = "${ghidra.pname}-with-extensions-${lib.getVersion ghidra}";
paths = (f allExtensions);
nativeBuildInputs = [ makeBinaryWrapper ]
++ lib.optional stdenv.hostPlatform.isDarwin desktopToDarwinBundle;
postBuild = ''
makeWrapper '${ghidra}/bin/ghidra' "$out/bin/ghidra" \
--set NIX_GHIDRAHOME "$out/lib/ghidra/Ghidra"
ln -s ${ghidra}/share $out/share
'' + lib.optionalString stdenv.hostPlatform.isDarwin ''
convertDesktopFiles $prefix
'';
inherit (ghidra) meta;
});
in
withExtensions
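
As a usage note, the extensions from extensions.nix can be combined with the wrapper above from a NixOS configuration; a minimal sketch using only packages that exist in this set:

```nix
# sketch: ghidra bundled with two of the extensions packaged in this repo
environment.systemPackages = [
  (pkgs.ghidra.withExtensions (p: with p; [
    gnudisassembler
    sleighdevtools
  ]))
];
```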

View File

@ -0,0 +1,5 @@
{ kicad }:
{
kikit = kicad.callPackage ./kikit.nix { addonName = "kikit"; };
kikit-library = kicad.callPackage ./kikit.nix { addonName = "kikit-library"; };
}

View File

@ -0,0 +1,52 @@
# For building the multiple addons that are in the kikit repo.
{ stdenv
, bc
, kikit
, zip
, python3
, addonName
, addonPath
}:
let
# This python is only used when building the package, it's not the python
# environment that will ultimately run the code packaged here. The python env defined
# in KiCad will import the python code packaged here when KiCad starts up.
python = python3.withPackages (ps: with ps; [ click ]);
kikit-module = python3.pkgs.toPythonModule (kikit.override { inherit python3; });
# The following different addons can be built from the same source.
targetSpecs = {
"kikit" = {
makeTarget = "pcm-kikit";
resultZip = "pcm-kikit.zip";
description = "KiCad plugin and a CLI tool to automate several tasks in a standard KiCad workflow";
};
"kikit-library" = {
makeTarget = "pcm-lib";
resultZip = "pcm-kikit-lib.zip";
description = "KiKit uses these symbols and footprints to annotate your boards (e.g., to place a tab in a panel).";
};
};
targetSpec = targetSpecs.${addonName};
in
stdenv.mkDerivation {
name = "kicadaddon-${addonName}";
inherit (kikit-module) src version;
nativeBuildInputs = [ python bc zip ];
propagatedBuildInputs = [ kikit-module ];
buildPhase = ''
patchShebangs scripts/setJson.py
make ${targetSpec.makeTarget}
'';
installPhase = ''
mkdir $out
mv build/${targetSpec.resultZip} $out/${addonPath}
'';
meta = kikit-module.meta // {
description = targetSpec.description;
};
}

211
pkgs/kicad-xenia/base.nix Normal file
View File

@ -0,0 +1,211 @@
{ lib
, stdenv
, cmake
, libGLU
, libGL
, zlib
, wxGTK
, gtk3
, libX11
, gettext
, glew
, glm
, cairo
, curl
, openssl
, boost
, pkg-config
, doxygen
, graphviz
, pcre
, libpthreadstubs
, libXdmcp
, unixODBC
, libgit2
, libsecret
, libgcrypt
, libgpg-error
, util-linux
, libselinux
, libsepol
, libthai
, libdatrie
, libxkbcommon
, libepoxy
, dbus
, at-spi2-core
, libXtst
, pcre2
, libdeflate
, swig4
, python
, wxPython
, opencascade-occt_7_6
, libngspice
, valgrind
, stable
, testing
, baseName
, kicadSrc
, kicadVersion
, withNgspice
, withScripting
, withI18n
, debug
, sanitizeAddress
, sanitizeThreads
}:
assert lib.assertMsg (!(sanitizeAddress && sanitizeThreads))
"'sanitizeAddress' and 'sanitizeThreads' are mutually exclusive, use one.";
assert testing -> !stable
-> throw "testing implies stable and cannot be used with stable = false";
let
opencascade-occt = opencascade-occt_7_6;
inherit (lib) optional optionals optionalString;
in
stdenv.mkDerivation rec {
pname = "kicad-base";
version = if (stable) then kicadVersion else builtins.substring 0 10 src.rev;
src = kicadSrc;
patches = [
# upstream issue 12941 (attempted to upstream, but appreciably unacceptable)
./writable.patch
# https://gitlab.com/kicad/code/kicad/-/issues/15687
./runtime_stock_data_path.patch
];
# tagged releases don't have "unknown"
# kicad testing and nightlies use git describe --dirty
  # nix removes .git, so it's approximated here
postPatch = lib.optionalString (!stable || testing) ''
substituteInPlace cmake/KiCadVersion.cmake \
--replace "unknown" "${builtins.substring 0 10 src.rev}"
substituteInPlace cmake/CreateGitVersionHeader.cmake \
--replace "0000000000000000000000000000000000000000" "${src.rev}"
'';
makeFlags = optionals (debug) [ "CFLAGS+=-Og" "CFLAGS+=-ggdb" ];
cmakeFlags = [
"-DKICAD_USE_EGL=ON"
"-DOCC_INCLUDE_DIR=${opencascade-occt}/include/opencascade"
# https://gitlab.com/kicad/code/kicad/-/issues/17133
"-DCMAKE_CTEST_ARGUMENTS='--exclude-regex;qa_spice'"
]
++ optional (stdenv.hostPlatform.system == "aarch64-linux")
"-DCMAKE_CTEST_ARGUMENTS=--exclude-regex;'qa_spice|qa_cli'"
++ optional (stable && !withNgspice) "-DKICAD_SPICE=OFF"
++ optionals (!withScripting) [
"-DKICAD_SCRIPTING_WXPYTHON=OFF"
]
++ optionals (withI18n) [
"-DKICAD_BUILD_I18N=ON"
]
++ optionals (!doInstallCheck) [
"-DKICAD_BUILD_QA_TESTS=OFF"
]
++ optionals (debug) [
"-DKICAD_STDLIB_DEBUG=ON"
"-DKICAD_USE_VALGRIND=ON"
]
++ optionals (sanitizeAddress) [
"-DKICAD_SANITIZE_ADDRESS=ON"
]
++ optionals (sanitizeThreads) [
"-DKICAD_SANITIZE_THREADS=ON"
];
cmakeBuildType = if debug then "Debug" else "Release";
nativeBuildInputs = [
cmake
doxygen
graphviz
pkg-config
libgit2
libsecret
libgcrypt
libgpg-error
]
# wanted by configuration on linux, doesn't seem to affect performance
# no effect on closure size
++ optionals (stdenv.isLinux) [
util-linux
libselinux
libsepol
libthai
libdatrie
libxkbcommon
libepoxy
dbus
at-spi2-core
libXtst
pcre2
];
buildInputs = [
libGLU
libGL
zlib
libX11
wxGTK
gtk3
pcre
libXdmcp
gettext
glew
glm
libpthreadstubs
cairo
curl
openssl
boost
swig4
python
unixODBC
libdeflate
opencascade-occt
]
++ optional (withScripting) wxPython
++ optional (withNgspice) libngspice
++ optional (debug) valgrind;
# some ngspice tests attempt to write to $HOME/.cache/
# this could be and was resolved with XDG_CACHE_HOME = "$TMP";
# but failing tests still attempt to create $HOME
# and the newer CLI tests seem to also use $HOME...
HOME = "$TMP";
# debug builds fail all but the python test
doInstallCheck = !(debug);
installCheckTarget = "test";
nativeInstallCheckInputs = [
(python.withPackages(ps: with ps; [
numpy
pytest
cairosvg
pytest-image-diff
]))
];
dontStrip = debug;
meta = {
description = "Just the built source without the libraries";
longDescription = ''
      Just the build products; the libraries are passed via env vars in the wrapper, default.nix
'';
homepage = "https://www.kicad.org/";
license = lib.licenses.gpl3Plus;
platforms = lib.platforms.all;
};
}

View File

@ -0,0 +1,298 @@
{ lib, stdenv
, runCommand
, newScope
, fetchFromGitLab
, fetchgit
, makeWrapper
, symlinkJoin
, callPackage
, callPackages
, gnome
, dconf
, gtk3
, wxGTK32
, librsvg
, cups
, gsettings-desktop-schemas
, hicolor-icon-theme
, unzip
, jq
, pname ? "kicad"
, stable ? true
, testing ? false
, withNgspice ? !stdenv.isDarwin
, libngspice
, withScripting ? true
, python3
, addons ? [ ]
, debug ? false
, sanitizeAddress ? false
, sanitizeThreads ? false
, with3d ? true
, withI18n ? true
, srcs ? { }
}:
# `addons`: https://dev-docs.kicad.org/en/addons/
#
# ```nix
# kicad = pkgs.kicad.override {
# addons = with pkgs.kicadAddons; [ kikit kikit-library ];
# };
# ```
# The `srcs` parameter can be used to override the kicad source code
# and all libraries, which are otherwise inaccessible
# to overlays since most of the kicad build expression has been
# refactored into base.nix, most of the library build expressions have
# been refactored into libraries.nix. Overrides are only applied when
# building `kicad-unstable`. The `srcs` parameter has
# no effect for stable `kicad`. `srcs` takes an attribute set in which
# any of the following attributes are meaningful (though none are
# mandatory): "kicad", "kicadVersion", "symbols", "templates",
# "footprints", "packages3d", and "libVersion". "kicadVersion" and
# "libVersion" should be set to a string with the desired value for
# the version attribute in kicad's `mkDerivation` and the version
# attribute in any of the library's `mkDerivation`, respectively.
# "kicad", "symbols", "templates", "footprints", and "packages3d"
# should be set to an appropriate fetcher (e.g. `fetchFromGitLab`).
# So, for example, a possible overlay for kicad is:
#
# final: prev:
# {
# kicad-unstable = (prev.kicad-unstable.override {
# srcs = {
# kicadVersion = "2020-10-08";
# kicad = prev.fetchFromGitLab {
# group = "kicad";
# owner = "code";
# repo = "kicad";
# rev = "fd22fe8e374ce71d57e9f683ba996651aa69fa4e";
# sha256 = "sha256-F8qugru/jU3DgZSpQXQhRGNFSk0ybFRkpyWb7HAGBdc=";
# };
# };
# });
# }
let
baseName = if (testing) then "kicad-testing"
else if (stable) then "kicad"
else "kicad-unstable";
versionsImport = import ./versions.nix;
# versions.nix does not provide us with version, src and rev. We
  # need to turn this into appropriate fetcher calls.
#kicadSrcFetch = fetchFromGitLab {
# group = "kicad";
# owner = "code";
# repo = "kicad";
# rev = versionsImport.${baseName}.kicadVersion.src.rev;
# sha256 = versionsImport.${baseName}.kicadVersion.src.sha256;
#};
kicadSrcFetch = fetchgit {
url = "https://git.lain.faith/haskal/kicad.git";
rev = versionsImport.${baseName}.kicadVersion.src.rev;
sha256 = versionsImport.${baseName}.kicadVersion.src.sha256;
};
libSrcFetch = name: fetchFromGitLab {
group = "kicad";
owner = "libraries";
repo = "kicad-${name}";
rev = versionsImport.${baseName}.libVersion.libSources.${name}.rev;
sha256 = versionsImport.${baseName}.libVersion.libSources.${name}.sha256;
};
# only override `src` or `version` if building `kicad-unstable` with
# the appropriate attribute defined in `srcs`.
srcOverridep = attr: (!stable && builtins.hasAttr attr srcs);
# use default source and version (as defined in versions.nix) by
# default, or use the appropriate attribute from `srcs` if building
# unstable with `srcs` properly defined.
kicadSrc =
if srcOverridep "kicad" then srcs.kicad
else kicadSrcFetch;
kicadVersion =
if srcOverridep "kicadVersion" then srcs.kicadVersion
else versionsImport.${baseName}.kicadVersion.version;
libSrc = name: if srcOverridep name then srcs.${name} else libSrcFetch name;
# TODO does it make sense to only have one version for all libs?
libVersion =
if srcOverridep "libVersion" then srcs.libVersion
else versionsImport.${baseName}.libVersion.version;
wxGTK = wxGTK32;
python = python3;
wxPython = python.pkgs.wxpython;
addonPath = "addon.zip";
addonsDrvs = map (pkg: pkg.override { inherit addonPath python3; }) addons;
addonsJoined =
runCommand "addonsJoined"
{
inherit addonsDrvs;
nativeBuildInputs = [ unzip jq ];
} ''
mkdir $out
for pkg in $addonsDrvs; do
unzip $pkg/addon.zip -d unpacked
folder_name=$(jq .identifier unpacked/metadata.json --raw-output | tr . _)
for d in unpacked/*; do
if [ -d "$d" ]; then
dest=$out/share/kicad/scripting/$(basename $d)/$folder_name
mkdir -p $(dirname $dest)
mv $d $dest
fi
done
rm -r unpacked
done
'';
inherit (lib) concatStringsSep flatten optionalString optionals;
in
stdenv.mkDerivation rec {
# Common libraries, referenced during runtime, via the wrapper.
passthru.libraries = callPackages ./libraries.nix { inherit libSrc; };
passthru.callPackage = newScope { inherit addonPath python3; };
base = callPackage ./base.nix {
inherit stable testing baseName;
inherit kicadSrc kicadVersion;
inherit wxGTK python wxPython;
inherit withNgspice withScripting withI18n;
inherit debug sanitizeAddress sanitizeThreads;
};
inherit pname;
version = if (stable) then kicadVersion else builtins.substring 0 10 src.src.rev;
src = base;
dontUnpack = true;
dontConfigure = true;
dontBuild = true;
dontFixup = true;
pythonPath = optionals (withScripting)
[ wxPython python.pkgs.six python.pkgs.requests ] ++ addonsDrvs;
nativeBuildInputs = [ makeWrapper ]
++ optionals (withScripting)
[ python.pkgs.wrapPython ];
# KICAD7_TEMPLATE_DIR only works with a single path (it does not handle : separated paths)
# but it's used to find both the templates and the symbol/footprint library tables
# https://gitlab.com/kicad/code/kicad/-/issues/14792
template_dir = symlinkJoin {
name = "KiCad_template_dir";
paths = with passthru.libraries; [
"${templates}/share/kicad/template"
"${footprints}/share/kicad/template"
"${symbols}/share/kicad/template"
];
};
# We are emulating wrapGAppsHook3, along with other variables to the wrapper
makeWrapperArgs = with passthru.libraries; [
"--prefix XDG_DATA_DIRS : ${base}/share"
"--prefix XDG_DATA_DIRS : ${hicolor-icon-theme}/share"
"--prefix XDG_DATA_DIRS : ${gnome.adwaita-icon-theme}/share"
"--prefix XDG_DATA_DIRS : ${gtk3}/share/gsettings-schemas/${gtk3.name}"
"--prefix XDG_DATA_DIRS : ${gsettings-desktop-schemas}/share/gsettings-schemas/${gsettings-desktop-schemas.name}"
# wrapGAppsHook3 did these two as well, no idea if it matters...
"--prefix XDG_DATA_DIRS : ${cups}/share"
"--prefix GIO_EXTRA_MODULES : ${dconf}/lib/gio/modules"
# required to open a bug report link in firefox-wayland
"--set-default MOZ_DBUS_REMOTE 1"
"--set-default KICAD8_FOOTPRINT_DIR ${footprints}/share/kicad/footprints"
"--set-default KICAD8_SYMBOL_DIR ${symbols}/share/kicad/symbols"
"--set-default KICAD8_TEMPLATE_DIR ${template_dir}"
]
++ optionals (addons != [ ]) (
let stockDataPath = symlinkJoin {
name = "kicad_stock_data_path";
paths = [
"${base}/share/kicad"
"${addonsJoined}/share/kicad"
];
};
in
[ "--set-default NIX_KICAD8_STOCK_DATA_PATH ${stockDataPath}" ]
)
++ optionals (with3d)
[
"--set-default KICAD8_3DMODEL_DIR ${packages3d}/share/kicad/3dmodels"
]
++ optionals (withNgspice) [ "--prefix LD_LIBRARY_PATH : ${libngspice}/lib" ]
# infinisil's workaround for #39493
++ [ "--set GDK_PIXBUF_MODULE_FILE ${librsvg}/lib/gdk-pixbuf-2.0/2.10.0/loaders.cache" ]
;
# why does $makeWrapperArgs have to be added explicitly?
# $out and $program_PYTHONPATH don't exist when makeWrapperArgs gets set?
installPhase =
let
bin = if stdenv.isDarwin then "*.app/Contents/MacOS" else "bin";
tools = [ "kicad" "pcbnew" "eeschema" "gerbview" "pcb_calculator" "pl_editor" "bitmap2component" ];
utils = [ "dxf2idf" "idf2vrml" "idfcyl" "idfrect" "kicad-cli" ];
in
(concatStringsSep "\n"
(flatten [
"runHook preInstall"
(optionalString (withScripting) "buildPythonPath \"${base} $pythonPath\" \n")
# wrap each of the directly usable tools
(map
(tool: "makeWrapper ${base}/${bin}/${tool} $out/bin/${tool} $makeWrapperArgs"
+ optionalString (withScripting) " --set PYTHONPATH \"$program_PYTHONPATH\""
)
tools)
# link in the CLI utils
(map (util: "ln -s ${base}/${bin}/${util} $out/bin/${util}") utils)
"runHook postInstall"
])
)
;
postInstall = ''
mkdir -p $out/share
ln -s ${base}/share/applications $out/share/applications
ln -s ${base}/share/icons $out/share/icons
ln -s ${base}/share/mime $out/share/mime
ln -s ${base}/share/metainfo $out/share/metainfo
'';
passthru.updateScript = {
command = [ ./update.sh "${pname}" ];
supportedFeatures = [ "commit" ];
};
meta = rec {
description = (if (stable)
then "Open Source Electronics Design Automation suite"
else if (testing) then "Open Source EDA suite, latest on stable branch"
else "Open Source EDA suite, latest on master branch")
+ (lib.optionalString (!with3d) ", without 3D models");
homepage = "https://www.kicad.org/";
longDescription = ''
KiCad is an open source software suite for Electronic Design Automation.
The Programs handle Schematic Capture, and PCB Layout with Gerber output.
'';
license = lib.licenses.gpl3Plus;
maintainers = with lib.maintainers; [ evils ];
platforms = lib.platforms.all;
broken = stdenv.isDarwin;
mainProgram = "kicad";
};
}
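
Beyond the `addons` example in the header comment, the remaining parameters above are ordinary `override` arguments; a minimal sketch, assuming the overlay exposes these as `pkgs.kicad` and `pkgs.kicadAddons` (as the header comment does):

```nix
# sketch: a slimmer kicad without the large 3D model library or ngspice
pkgs.kicad.override {
  with3d = false;        # skip packages3d
  withNgspice = false;
  addons = with pkgs.kicadAddons; [ kikit ];
}
```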

View File

@ -0,0 +1,39 @@
{ lib, stdenv
, cmake
, gettext
, libSrc
, stepreduce
, parallel
, zip
}:
let
mkLib = name:
stdenv.mkDerivation {
pname = "kicad-${name}";
version = builtins.substring 0 10 (libSrc name).rev;
src = libSrc name;
nativeBuildInputs = [ cmake ]
++ lib.optionals (name == "packages3d") [
stepreduce
parallel
zip
];
postInstall = lib.optional (name == "packages3d") ''
find $out -type f -name '*.step' | parallel 'stepreduce {} {} && zip -9 {.}.stpZ {} && rm {}'
'';
meta = rec {
license = lib.licenses.cc-by-sa-40;
platforms = lib.platforms.all;
};
};
in
{
symbols = mkLib "symbols";
templates = mkLib "templates";
footprints = mkLib "footprints";
packages3d = mkLib "packages3d";
}

View File

@ -0,0 +1,15 @@
diff --git a/common/paths.cpp b/common/paths.cpp
index a74cdd9..790cc58 100644
--- a/common/paths.cpp
+++ b/common/paths.cpp
@@ -151,6 +151,10 @@ wxString PATHS::GetStockDataPath( bool aRespectRunFromBuildDir )
{
wxString path;
+ if( wxGetEnv( wxT( "NIX_KICAD8_STOCK_DATA_PATH" ), &path ) ) {
+ return path;
+ }
+
if( aRespectRunFromBuildDir && wxGetEnv( wxT( "KICAD_RUN_FROM_BUILD_DIR" ), nullptr ) )
{
// Allow debugging from build dir by placing relevant files/folders in the build root

260
pkgs/kicad-xenia/update.sh Executable file
View File

@ -0,0 +1,260 @@
#!/usr/bin/env nix-shell
#!nix-shell -i bash -p coreutils git nix curl jq
# shellcheck shell=bash enable=all
set -e
shopt -s inherit_errexit
# this script will generate versions.nix in the right location
# this should contain the versions' revs and hashes
# the stable revs are stored only for ease of skipping
# by default nix-prefetch-url uses XDG_RUNTIME_DIR as tmp
# which is /run/user/1000, which defaults to 10% of your RAM
# unless you have over 64GB of ram that'll be insufficient
# resulting in "tar: no space left on device" for packages3d
# hence:
export TMPDIR=/tmp
# if something goes irreparably wrong, run 'update.sh all clean'
# TODO
# support parallel instances for each pname
# currently risks reusing old data
# no getting around manually checking if the build product works...
#  if there is, default to committing?
# won't work when running in parallel?
# remove items left in /nix/store?
# reuse hashes of already checked revs (to avoid redownloading testing's packages3d)
# nixpkgs' update.nix passes in UPDATE_NIX_PNAME to indicate which package is being updated
# assigning a default value to that as shellcheck doesn't like the use of unassigned variables
: "${UPDATE_NIX_PNAME:=""}"
# update.nix can also parse JSON output of this script to formulate a commit
# this requires we collect the version string in the old versions.nix for the updated package
old_version=""
new_version=""
# get the latest tag that isn't an RC or *.99
latest_tags="$(git ls-remote --tags --sort -version:refname https://gitlab.com/kicad/code/kicad.git)"
# using a scratch variable to ensure command failures get caught (SC2312)
scratch="$(grep -o 'refs/tags/[0-9]*\.[0-9]*\.[0-9]*$' <<< "${latest_tags}")"
scratch="$(grep -ve '\.99' -e '\.9\.9' <<< "${scratch}")"
scratch="$(sed -n '1p' <<< "${scratch}")"
latest_tag="$(cut -d '/' -f 3 <<< "${scratch}")"
# get the latest branch name for testing
branches="$(git ls-remote --heads --sort -version:refname https://gitlab.com/kicad/code/kicad.git)"
scratch="$(grep -o 'refs/heads/[0-9]*\.[0-9]*$' <<< "${branches}")"
scratch="$(sed -n '1p' <<< "${scratch}")"
testing_branch="$(cut -d '/' -f 3 <<< "${scratch}")"
# "latest_tag" and "master" directly refer to what we want
# "testing" uses "testing_branch" found above
all_versions=( "${latest_tag}" testing master )
prefetch="nix-prefetch-url --unpack --quiet"
clean=""
check_stable=""
check_testing=1
check_unstable=1
commit=""
for arg in "$@" "${UPDATE_NIX_PNAME}"; do
case "${arg}" in
help|-h|--help) echo "Read me!" >&2; exit 1; ;;
kicad|kicad-small|release|tag|stable|5*|6*|7*|8*) check_stable=1; check_testing=""; check_unstable="" ;;
*testing|kicad-testing-small) check_testing=1; check_unstable="" ;;
*unstable|*unstable-small|master|main) check_unstable=1; check_testing="" ;;
latest|now|today) check_unstable=1; check_testing=1 ;;
all|both|full) check_stable=1; check_testing=1; check_unstable=1 ;;
clean|fix|*fuck) check_stable=1; check_testing=1; check_unstable=1; clean=1 ;;
commit) commit=1 ;;
*) ;;
esac
done
here="$( cd "$( dirname "${BASH_SOURCE[0]}" )" >/dev/null 2>&1 && pwd )"
commit_date() {
gitlab_json="$(curl -s https://gitlab.com/api/v4/projects/kicad%2Fcode%2Fkicad/repository/commits/"$1")"
commit_created="$(jq .created_at --raw-output <<< "${gitlab_json}")"
date --date="${commit_created}" --iso-8601 --utc
}
file="${here}/versions.nix"
# just in case this runs in parallel
tmp="${here}/,versions.nix.${RANDOM}"
libs=( symbols templates footprints packages3d )
get_rev() {
git ls-remote "$@"
}
gitlab="https://gitlab.com/kicad"
# append commit hash or tag
src_pre="https://gitlab.com/api/v4/projects/kicad%2Fcode%2Fkicad/repository/archive.tar.gz?sha="
lib_pre="https://gitlab.com/api/v4/projects/kicad%2Flibraries%2Fkicad-"
lib_mid="/repository/archive.tar.gz?sha="
# number of items updated
count=0
printf "Latest tag is %s\n" "${latest_tag}" >&2
if [[ ! -f ${file} ]]; then
echo "No existing file, generating from scratch" >&2
check_stable=1; check_testing=1; check_unstable=1; clean=1
fi
printf "Writing %s\n" "${tmp}" >&2
# not a dangling brace, grouping the output to redirect to file
{
printf "# This file was generated by update.sh\n\n"
printf "{\n"
for version in "${all_versions[@]}"; do
src_version=${version};
lib_version=${version};
# testing is the stable branch on the main repo
# but the libraries don't have such a branch
# only the latest release tag and a master branch
if [[ ${version} == "testing" ]]; then
src_version=${testing_branch};
lib_version=${latest_tag};
fi
if [[ ${version} == "master" ]]; then
pname="kicad-unstable"
elif [[ ${version} == "testing" ]]; then
pname="kicad-testing"
else
pname="kicad"
fi
# skip a version if we don't want to check it
if [[ (-n ${check_stable} && ${version} != "master" && ${version} != "testing") \
|| (-n ${check_testing} && ${version} == "testing") \
|| (-n ${check_unstable} && ${version} == "master" ) ]]; then
now=$(commit_date "${src_version}")
if [[ ${version} == "master" ]]; then
pname="kicad-unstable"
new_version="${now}"
elif [[ ${version} == "testing" ]]; then
pname="kicad-testing"
new_version="${testing_branch}-${now}"
else
pname="kicad"
new_version="${version}"
fi
printf "\nChecking %s\n" "${pname}" >&2
printf "%2s\"%s\" = {\n" "" "${pname}"
printf "%4skicadVersion = {\n" ""
printf "%6sversion =\t\t\t\"%s\";\n" "" "${new_version}"
printf "%6ssrc = {\n" ""
echo "Checking src" >&2
scratch="$(get_rev "${gitlab}"/code/kicad.git "${src_version}")"
src_rev="$(cut -f1 <<< "${scratch}")"
has_rev="$(grep -sm 1 "\"${pname}\"" -A 4 "${file}" | grep -sm 1 "${src_rev}" || true)"
has_hash="$(grep -sm 1 "\"${pname}\"" -A 5 "${file}" | grep -sm 1 "sha256" || true)"
old_version="$(grep -sm 1 "\"${pname}\"" -A 3 "${file}" | grep -sm 1 "version" | awk -F "\"" '{print $2}' || true)"
if [[ -n ${has_rev} && -n ${has_hash} && -z ${clean} ]]; then
echo "Reusing old ${pname}.src.sha256, already latest .rev at ${old_version}" >&2
scratch=$(grep -sm 1 "\"${pname}\"" -A 5 "${file}")
grep -sm 1 "rev" -A 1 <<< "${scratch}"
else
prefetched="$(${prefetch} "${src_pre}${src_rev}")"
printf "%8srev =\t\t\t\"%s\";\n" "" "${src_rev}"
printf "%8ssha256 =\t\t\"%s\";\n" "" "${prefetched}"
count=$((count+1))
fi
printf "%6s};\n" ""
printf "%4s};\n" ""
printf "%4slibVersion = {\n" ""
printf "%6sversion =\t\t\t\"%s\";\n" "" "${new_version}"
printf "%6slibSources = {\n" ""
for lib in "${libs[@]}"; do
echo "Checking ${lib}" >&2
url="${gitlab}/libraries/kicad-${lib}.git"
scratch="$(get_rev "${url}" "${lib_version}")"
scratch="$(cut -f1 <<< "${scratch}")"
lib_rev="$(tail -n1 <<< "${scratch}")"
has_rev="$(grep -sm 1 "\"${pname}\"" -A 19 "${file}" | grep -sm 1 "${lib_rev}" || true)"
has_hash="$(grep -sm 1 "\"${pname}\"" -A 20 "${file}" | grep -sm 1 "${lib}.sha256" || true)"
if [[ -n ${has_rev} && -n ${has_hash} && -z ${clean} ]]; then
echo "Reusing old kicad-${lib}-${new_version}.src.sha256, already latest .rev" >&2
scratch="$(grep -sm 1 "\"${pname}\"" -A 20 "${file}")"
grep -sm 1 "${lib}" -A 1 <<< "${scratch}"
else
prefetched="$(${prefetch} "${lib_pre}${lib}${lib_mid}${lib_rev}")"
printf "%8s%s.rev =\t" "" "${lib}"
case "${lib}" in
symbols|templates) printf "\t" ;; *) ;;
esac
printf "\"%s\";\n" "${lib_rev}"
printf "%8s%s.sha256 =\t\"%s\";\n" "" "${lib}" "${prefetched}"
count=$((count+1))
fi
done
printf "%6s};\n" ""
printf "%4s};\n" ""
printf "%2s};\n" ""
else
printf "\nReusing old %s\n" "${pname}" >&2
grep -sm 1 "\"${pname}\"" -A 21 "${file}"
fi
done
printf "}\n"
} > "${tmp}"
if grep '""' "${tmp}"; then
echo "empty value detected, out of space?" >&2
exit "1"
fi
mv "${tmp}" "${file}"
printf "\nFinished\nMoved output to %s\n\n" "${file}" >&2
if [[ ${count} -gt 0 ]]; then
if [[ ${count} -gt 1 ]]; then s="s"; else s=""; fi
echo "${count} revision${s} changed" >&2
if [[ -n ${commit} ]]; then
git commit -am "$(printf "kicad: automatic update of %s item%s\n" "${count}" "${s}")"
fi
echo "Please confirm the new versions.nix works before making a PR." >&2
else
echo "No changes, those checked are up to date" >&2
fi
# using UPDATE_NIX_ATTR_PATH to detect if this is being called from update.nix
# and output JSON to describe the changes
if [[ -n ${UPDATE_NIX_ATTR_PATH} ]]; then
if [[ ${count} -eq 0 ]]; then echo "[{}]"; exit 0; fi
jq -n \
--arg attrpath "${UPDATE_NIX_PNAME}" \
--arg oldversion "${old_version}" \
--arg newversion "${new_version}" \
--arg file "${file}" \
'[{
"attrPath": $attrpath,
"oldVersion": $oldversion,
"newVersion": $newversion,
"files": [ $file ]
}]'
fi

View File

@ -0,0 +1,70 @@
# This file was generated by update.sh
{
"kicad" = {
kicadVersion = {
version = "8.0.2";
src = {
rev = "2d5434e9abf570ffd19b22c90963ea71cfb91d3d";
sha256 = "1n1jj7559xd4ib4c6ybya75a5hbarnkfy8gxzxfw58wdb4lxxmzz";
};
};
libVersion = {
version = "8.0.2";
libSources = {
symbols.rev = "099ac0c8ac402a685fde00b1369e34a116e29661";
symbols.sha256 = "0w333f89yw2m0zlpkg0k6hfwlj10snm8laihdjnsb22asyz4pbhn";
templates.rev = "2e2da58e02707d327d59d4101c401a82dc9a26f6";
templates.sha256 = "073a6cyvzzy0vmkj3ip4ziq7b7pcizs70nm5acw838dxghjfyv3v";
footprints.rev = "e8c30550cde4945cbe1bf30cccf0b3c1e2bda6c6";
footprints.sha256 = "10j8qjljc1fv8k4zp3zn0da33g57hn6pgrgmbgp18dsa539xvxcz";
packages3d.rev = "249f7947587529026e1676cd70c8d7493a8d8162";
packages3d.sha256 = "04gvfb54jhnww2qwrxc27wpyrvmjasdc4xhr0ridl7dglh4qcp35";
};
};
};
# "kicad-testing" = {
# kicadVersion = {
# version = "8.0-2024-02-23";
# src = {
# rev = "14d71c8ca6b48d2eb956bb069acf05a37b1b2652";
# sha256 = "0xqd0xbpnvsvba75526nwgzr8l2cfxy99sjmg13sjxfx7rq16kqi";
# };
# };
# libVersion = {
# version = "8.0-2024-02-23";
# libSources = {
# symbols.rev = "e228d4e8b295364e90e36c57f4023d8285ba88cd";
# symbols.sha256 = "049h2a7yn6ks8sybppixa872dbvyd0rwf9r6nixvdg6d13fl6rwf";
# templates.rev = "2e00c233b67e35323f90d04c190bf70237a252f2";
# templates.sha256 = "0m9bggz3cm27kqpjjwxy19mqzk0c69bywcjkqcni7kafr21c6k4z";
# footprints.rev = "6e5329a6d4aaa81290e23af3eba88f505c2f61b0";
# footprints.sha256 = "0ypjlbmzmcl3pha3q2361va70c988b1drxy8320gm66jkzfc21a1";
# packages3d.rev = "d1e521228d9f5888836b1a6a35fb05fb925456fa";
# packages3d.sha256 = "0lcy1av7ixg1f7arflk50jllpc1749sfvf3h62hkxsz97wkr97xj";
# };
# };
# };
# "kicad-unstable" = {
# kicadVersion = {
# version = "2024-02-23";
# src = {
# rev = "b7b64d959f37f00bb0d14b007c3b3908196e1024";
# sha256 = "1gl7mjqpmqq4m55z6crwb77983g00gi2161ichsc7hsfhs4c8grh";
# };
# };
# libVersion = {
# version = "2024-02-23";
# libSources = {
# symbols.rev = "8b0c343d8694fe0a968e5c4af69fd161bacf7da1";
# symbols.sha256 = "049h2a7yn6ks8sybppixa872dbvyd0rwf9r6nixvdg6d13fl6rwf";
# templates.rev = "0a6c4f798a68a5c639d54b4d3093460ab9267816";
# templates.sha256 = "0m9bggz3cm27kqpjjwxy19mqzk0c69bywcjkqcni7kafr21c6k4z";
# footprints.rev = "ded6b053460faae5783c538a38e91e2b4bddcf2e";
# footprints.sha256 = "035bf37n4vrihaj4zfdncisdx9fly1vya7lhkxhlsbv5blpi4a5y";
# packages3d.rev = "984667325076d4e50dab14e755aeacf97f42194c";
# packages3d.sha256 = "0lkaxv02h4sxrnm8zr17wl9d07mazlisad78r35gry741i362cdg";
# };
# };
# };
}

View File

@ -0,0 +1,49 @@
commit 6a72fd032405515e468797be91b5a6ebcbbb5fd8
Author: Evils <evils.devils@protonmail.com>
Date: Wed Nov 23 19:49:13 2022 +0100
ensure new projects are writable
diff --git a/kicad/kicad_manager_frame.cpp b/kicad/kicad_manager_frame.cpp
index 7ee8090858..391514519c 100644
--- a/kicad/kicad_manager_frame.cpp
+++ b/kicad/kicad_manager_frame.cpp
@@ -638,6 +638,12 @@ void KICAD_MANAGER_FRAME::CreateNewProject( const wxFileName& aProjectFileName,
// wxFFile dtor will close the file
}
+
+ if( destFileName.IsOk() && !destFileName.IsFileWritable() )
+ {
+ destFileName.SetPermissions(0644);
+ }
+
}
}
diff --git a/kicad/project_template.cpp b/kicad/project_template.cpp
index bf951fcddb..2bef94326b 100644
--- a/kicad/project_template.cpp
+++ b/kicad/project_template.cpp
@@ -282,6 +282,21 @@ bool PROJECT_TEMPLATE::CreateProject( wxFileName& aNewProjectPath, wxString* aEr
result = false;
}
+ else if( !destFile.IsFileWritable() && !destFile.SetPermissions(0644) )
+ {
+ if( aErrorMsg )
+ {
+ if( !aErrorMsg->empty() )
+ *aErrorMsg += "\n";
+
+ wxString msg;
+
+ msg.Printf( _( "Cannot make file writable: '%s'." ), destFile.GetFullPath() );
+ *aErrorMsg += msg;
+ }
+
+ result = false;
+ }
}
return result;

View File

@ -1,199 +0,0 @@
From d9e022548aff94e90914baa921ddb4cd939c0e5c Mon Sep 17 00:00:00 2001
From: xenia <xenia@awoo.systems>
Date: Sat, 21 Dec 2024 15:33:10 -0500
Subject: [PATCH] implement lix support
---
CMakeLists.txt | 27 ------------
extra-builtins.cc | 91 ++++++++++++++++-------------------------
meson.build | 18 ++++++++
nix-plugins-config.h.in | 3 --
4 files changed, 53 insertions(+), 86 deletions(-)
delete mode 100644 CMakeLists.txt
create mode 100644 meson.build
delete mode 100644 nix-plugins-config.h.in
diff --git a/CMakeLists.txt b/CMakeLists.txt
deleted file mode 100644
index 9674fe8..0000000
--- a/CMakeLists.txt
+++ /dev/null
@@ -1,27 +0,0 @@
-cmake_minimum_required (VERSION 3.9)
-project (nix-plugins)
-set (nix-plugins_VERSION_MAJOR 15)
-set (nix-plugins_VERSION_MINOR 0)
-set (nix-plugins_VERSION_PATCH 0)
-
-find_package(PkgConfig)
-
-pkg_check_modules(NIX REQUIRED nix-expr>=2.24 nix-main>=2.24 nix-store>=2.24)
-
-find_path(BOOST_INCLUDE_DIR boost/format.hpp)
-if(BOOST_INCLUDE_DIR STREQUAL "BOOST_INCLUDE_DIR-NOTFOUND")
- message(FATAL_ERROR "Could not find Boost formatting library.")
-endif()
-include_directories(${BOOST_INCLUDE_DIR})
-
-if(APPLE)
- set(CMAKE_MODULE_LINKER_FLAGS "${CMAKE_MODULE_LINKER_FLAGS} -flat_namespace -undefined suppress")
-endif()
-
-add_library(nix-extra-builtins MODULE extra-builtins.cc)
-configure_file(nix-plugins-config.h.in nix-plugins-config.h)
-target_include_directories(nix-extra-builtins PUBLIC ${CMAKE_CURRENT_BINARY_DIR})
-target_include_directories(nix-extra-builtins PUBLIC ${NIX_INCLUDE_DIRS})
-target_compile_options(nix-extra-builtins PUBLIC ${NIX_CFLAGS_OTHER})
-
-install(TARGETS nix-extra-builtins DESTINATION lib/nix/plugins)
diff --git a/extra-builtins.cc b/extra-builtins.cc
index 3a0f90e..95aef5e 100644
--- a/extra-builtins.cc
+++ b/extra-builtins.cc
@@ -1,12 +1,8 @@
-#include <config.h>
-#include <primops.hh>
-#include <globals.hh>
-#include <config-global.hh>
-#include <eval-settings.hh>
-#include <common-eval-args.hh>
-#include <filtering-source-accessor.hh>
-
-#include "nix-plugins-config.h"
+#include <lix/config.h>
+#include <lix/libexpr/primops.hh>
+#include <lix/libstore/globals.hh>
+#include <lix/libexpr/eval-settings.hh>
+#include <lix/libcmd/common-eval-args.hh>
using namespace nix;
@@ -21,42 +17,41 @@ static ExtraBuiltinsSettings extraBuiltinsSettings;
static GlobalConfig::Register rp(&extraBuiltinsSettings);
-static void extraBuiltins(EvalState & state, const PosIdx pos,
+static void extraBuiltins(EvalState & state,
Value ** _args, Value & v)
{
- static auto extraBuiltinsFile = state.rootPath(CanonPath(extraBuiltinsSettings.extraBuiltinsFile.to_string()));
- if (auto rootFS2 = state.rootFS.dynamic_pointer_cast<AllowListSourceAccessor>())
- rootFS2->allowPrefix(CanonPath(extraBuiltinsFile.path.abs()));
+ static auto extraBuiltinsFile =
+ SourcePath(CanonPath(extraBuiltinsSettings.extraBuiltinsFile.to_string()));
try {
- auto fun = state.allocValue();
- state.evalFile(extraBuiltinsFile, *fun);
- Value * arg;
- if (evalSettings.enableNativeCode) {
- arg = state.baseEnv.values[0];
- } else {
- auto attrs = state.buildBindings(2);
-
- auto sExec = state.symbols.create("exec");
- attrs.alloc(sExec).mkPrimOp(new PrimOp {
- .name = "exec",
- .arity = 1,
- .fun = prim_exec,
- });
-
- auto sImportNative = state.symbols.create("importNative");
- attrs.alloc(sImportNative).mkPrimOp(new PrimOp {
- .name = "importNative",
- .arity = 2,
- .fun = prim_importNative,
- });
-
- arg = state.allocValue();
- arg->mkAttrs(attrs);
- }
+ auto fun = state.ctx.mem.allocValue();
+
+ // bypass the source path checking by directly reading and evaluating the file
+ // this also bypasses the eval cache but oh well
+ Expr& e = state.ctx.parseExprFromFile(extraBuiltinsFile.unsafeIntoChecked());
+ state.eval(e, *fun);
+
+ auto attrs = state.ctx.buildBindings(2);
+
+ auto sExec = state.ctx.symbols.create("exec");
+ attrs.alloc(sExec).mkPrimOp(new PrimOp {
+ .name = "exec",
+ .arity = 1,
+ .fun = prim_exec,
+ });
+
+ auto sImportNative = state.ctx.symbols.create("importNative");
+ attrs.alloc(sImportNative).mkPrimOp(new PrimOp {
+ .name = "importNative",
+ .arity = 2,
+ .fun = prim_importNative,
+ });
+
+ Value* arg = state.ctx.mem.allocValue();
+ arg->mkAttrs(attrs);
v.mkApp(fun, arg);
- state.forceValue(v, pos);
- } catch (FileNotFound &) {
+ state.forceValue(v, noPos);
+ } catch (SysError &) {
v.mkNull();
}
}
@@ -66,19 +61,3 @@ static RegisterPrimOp rp1({
.arity = 0,
.fun = extraBuiltins,
});
-
-static void cflags(EvalState & state, const PosIdx _pos,
- Value ** _args, Value & v)
-{
- auto attrs = state.buildBindings(3);
- attrs.alloc("NIX_INCLUDE_DIRS").mkString(NIX_INCLUDE_DIRS);
- attrs.alloc("NIX_CFLAGS_OTHER").mkString(NIX_CFLAGS_OTHER);
- attrs.alloc("BOOST_INCLUDE_DIR").mkString(BOOST_INCLUDE_DIR);
- v.mkAttrs(attrs);
-}
-
-static RegisterPrimOp rp2({
- .name = "__nix-cflags",
- .arity = 0,
- .fun = cflags,
-});
diff --git a/meson.build b/meson.build
new file mode 100644
index 0000000..0be6ce6
--- /dev/null
+++ b/meson.build
@@ -0,0 +1,18 @@
+project('lix-plugins',
+ ['c', 'cpp'],
+ default_options: ['cpp_std=gnu++20'],
+ version: '15.0.0')
+
+cpp = meson.get_compiler('cpp')
+pkgconfig = import('pkgconfig')
+
+lix_expr = dependency('lix-expr', version: '>=2.91')
+lix_store = dependency('lix-store', version: '>=2.91')
+lix_cmd = dependency('lix-cmd', version: '>=2.91')
+lix_main = dependency('lix-main', version: '>=2.91')
+boost = dependency('boost')
+
+library('lix-plugins',
+ 'extra-builtins.cc',
+ dependencies: [lix_expr, lix_store, lix_cmd, lix_main, boost],
+ install: true)
diff --git a/nix-plugins-config.h.in b/nix-plugins-config.h.in
deleted file mode 100644
index 459fea8..0000000
--- a/nix-plugins-config.h.in
+++ /dev/null
@@ -1,3 +0,0 @@
-#define NIX_INCLUDE_DIRS "@NIX_INCLUDE_DIRS@"
-#define NIX_CFLAGS_OTHER "@NIX_CFLAGS_OTHER@"
-#define BOOST_INCLUDE_DIR "@BOOST_INCLUDE_DIR@"
--
2.49.0

View File

@ -1,44 +0,0 @@
{
lib,
fetchFromGitHub,
stdenv,
meson,
ninja,
pkg-config,
lix,
capnproto,
boost182,
}: stdenv.mkDerivation {
name = "lix-plugins";
src = fetchFromGitHub {
owner = "shlevy";
repo = "nix-plugins";
rev = "15.0.0";
hash = "sha256-C4VqKHi6nVAHuXVhqvTRRyn0Bb619ez4LzgUWPH1cbM=";
};
patches = [ ./0001-implement-lix-support.patch ];
mesonBuildType = "release";
nativeBuildInputs = [
meson
ninja
pkg-config
];
buildInputs = [
lix
boost182
capnproto
];
meta = {
description = "Collection of miscellaneous plugins for the nix expression language.";
homepage = "https://github.com/shlevy/nix-plugins";
license = lib.licenses.mit;
maintainers = [];
platforms = lib.platforms.all;
};
}

View File

@ -1,32 +0,0 @@
{
fetchurl,
lib,
stdenvNoCC,
ocaml,
version ? lib.versions.majorMinor ocaml.version,
}: stdenvNoCC.mkDerivation {
pname = "ocaml-manual";
inherit version;
src = fetchurl {
url = "http://caml.inria.fr/distrib/ocaml-${version}/ocaml-${version}-refman-html.tar.gz";
hash = "sha256-NhtwltAJKxG5bwvu4hevK4xv45gRRaLEtNQ9ZW5NyvU=";
};
buildPhase = "";
installPhase = ''
mkdir -p "$out/share/doc/ocaml"
cp -r . "$out/share/doc/ocaml/."
'';
meta = {
description = "Offline manual for OCaml";
homepage = "https://ocaml.org";
license = lib.licenses.lgpl21Only;
maintainers = [];
platforms = lib.platforms.all;
};
}

View File

@ -1,12 +0,0 @@
{
runCommand,
patdiff,
}: runCommand "patdiff-bin-${patdiff.version}" {
nativeBuildInputs = [];
strictDeps = true;
meta.mainProgram = "patdiff";
} ''
mkdir -p $out
cp -r ${patdiff}/bin $out
cp -r ${patdiff}/share $out
''

View File

@ -1,5 +1,4 @@
 {
-  lib,
   fetchgit,
   buildDunePackage,
@ -21,12 +20,4 @@ buildDunePackage rec {
   nativeBuildInputs = [ ppxlib ];
   propagatedBuildInputs = [ ppxlib uunf ];
-  meta = {
-    description = "opinionated ppx for string literals";
-    homepage = "https://git.lain.faith/haskal/ppx_unicode";
-    license = lib.licenses.fyptl;
-    maintainers = [];
-    platforms = with lib.platforms; linux ++ darwin;
-  };
 }

View File

@ -1,38 +0,0 @@
{
lib,
fetchgit,
buildDunePackage,
cstruct,
dune-configurator,
eio,
eio_linux,
eio_main,
ppx_unicode,
ptime,
xlog,
}:
buildDunePackage rec {
pname = "systemd-ml";
version = "0.1.0";
src = fetchgit {
url = "https://git.lain.faith/haskal/systemd-ml.git";
rev = version;
hash = "sha256-IkWBObwQJF5wum46OsLTH1wmPqWnF5/UuTnBFbs/o/0=";
};
minimalOcamlVersion = "5.1";
dontStrip = true;
nativeBuildInputs = [ dune-configurator ppx_unicode ];
propagatedBuildInputs = [ cstruct dune-configurator eio eio_linux eio_main ppx_unicode ptime xlog ];
meta = {
description = "systemd-ml provides libsystemd-like functionality for interacting with the systemd service manager, in self-contained ocaml code (with a bit of C).";
homepage = "https://git.lain.faith/haskal/systemd-ml";
license = lib.licenses.fyptl;
maintainers = [];
platforms = lib.platforms.linux;
};
}

View File

@ -1,5 +1,4 @@
 {
-  lib,
   fetchgit,
   buildDunePackage,
@ -23,12 +22,4 @@ buildDunePackage rec {
   buildInputs = [ ppx_unicode ];
   propagatedBuildInputs = [ ptime ppxlib ];
   nativeBuildInputs = [ ppxlib ppx_unicode ];
-  meta = {
-    description = "logging library for cats written in ocaml";
-    homepage = "https://git.lain.faith/haskal/xlog";
-    license = lib.licenses.lgpl2Plus;
-    maintainers = [];
-    platforms = with lib.platforms; linux ++ darwin;
-  };
 }

View File

@ -58,13 +58,4 @@ in buildPythonPackage rec {
   doCheck = false;
   pythonImportsCheck = [ "feedvalidator" ];
-  meta = {
-    description = "W3C-customized version of the feedvalidator";
-    homepage = "https://github.com/w3c/feedvalidator";
-    license = lib.licenses.mit;
-    maintainers = [];
-    mainProgram = "feedvalidator";
-    platforms = lib.platforms.all;
-  };
 }

View File

@ -1,21 +0,0 @@
{
fetchPypi,
buildPythonPackage,
pyserial,
pyserial-asyncio,
}: buildPythonPackage rec {
pname = "megacom";
version = "0.1.2";
src = fetchPypi {
inherit pname version;
sha256 = "sha256-q2sU37uTX98RJDF0WFt7vzqtfLk3u25COCdKt34/Z70=";
};
dependencies = [
pyserial
pyserial-asyncio
];
doCheck = false;
}

View File

@ -1,25 +0,0 @@
From ff4fb2534bae3dfe9ed12f323d23fc9df17ea447 Mon Sep 17 00:00:00 2001
From: xenia <xenia@awoo.systems>
Date: Mon, 7 Apr 2025 12:40:59 -0400
Subject: [PATCH] Fix KDE window icon
---
pympress/app.py | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/pympress/app.py b/pympress/app.py
index 7f5e3b7..6286d3e 100644
--- a/pympress/app.py
+++ b/pympress/app.py
@@ -101,7 +101,7 @@ class Pympress(Gtk.Application):
def __init__(self):
GLib.set_application_name('pympress')
- # GLib.set_prgname('pympress') # Let prgname be auto-determined from sys.argv[0]
+ GLib.set_prgname('io.github.pympress')
Gtk.Application.__init__(self, application_id='io.github.pympress',
flags=Gio.ApplicationFlags.HANDLES_OPEN | Gio.ApplicationFlags.CAN_OVERRIDE_APP_ID)
--
2.47.2

View File

@ -1,73 +0,0 @@
{
lib,
racket,
racketInstallHook,
stdenv,
wrapGAppsHook3,
}: lib.extendMkDerivation {
constructDrv = stdenv.mkDerivation;
excludeDrvArgNames = [
"dependencies"
"tetheredInstallation"
"doMainSetup"
"buildDocs"
"gitSubpath"
];
extendDrvArgs = finalAttrs:
{
pname,
version,
nativeBuildInputs ? [],
propagatedBuildInputs ? [],
dependencies ? [],
tetheredInstallation ? false,
doMainSetup ? tetheredInstallation,
buildDocs ? tetheredInstallation,
gitSubpath ? ".",
...
} @ attrs: {
name = "racket${racket.version}-" + pname + "-" + version;
strictDeps = true;
dontConfigure = true;
dontBuild = true;
racketTetheredInstallation = tetheredInstallation;
racketDoMainSetup = doMainSetup;
racketBuildDocs = buildDocs;
racketGitSubpath = gitSubpath;
nativeBuildInputs = [
racket
racketInstallHook
wrapGAppsHook3
] ++ nativeBuildInputs;
propagatedBuildInputs = [racket] ++ dependencies ++ propagatedBuildInputs;
dontWrapGApps = true;
preFixup = ''
find $out/bin -type f -executable -print0 |
while IFS= read -r -d ''' f; do
if test "$(file --brief --mime-type "$f")" = application/x-executable; then
wrapGApp "$f"
fi
done
'' + (lib.optionalString (!tetheredInstallation) ''
find $out/bin -type f -executable -print0 |
while IFS= read -r -d ''' f; do
if test "$(file --brief --mime-type "$f")" = text/x-shellscript; then
substituteInPlace "$f" \
--replace-fail "\"\''${bindir}/racket\"" \
"\"\''${bindir}/racket\" --config $out/etc/racket/"
fi
done
'');
};
}
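
for orientation, a minimal sketch of how a package definition might call this builder once it is wired into the package set. the attribute name `buildRacketPackage`, the package name, URL, and hash below are placeholders for illustration, not taken from this repo; only the argument names (`pname`, `version`, `src`, `gitSubpath`, `dependencies`) come from the builder above

```nix
{
  lib,
  fetchgit,
  buildRacketPackage,
}: buildRacketPackage {
  # hypothetical package that lives in a subdirectory of a monorepo
  pname = "example-lib";
  version = "0.1.0";
  src = fetchgit {
    url = "https://example.invalid/example-monorepo.git";
    rev = "0.1.0";
    hash = lib.fakeHash; # placeholder; replace with the real hash after the first build attempt
  };
  gitSubpath = "example-lib"; # build from this subdirectory instead of "."
  dependencies = [ ]; # other packages built with this builder go here
}
```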

View File

@ -1,17 +0,0 @@
{
lib,
racket,
buildRacketPackage,
}: {
packages,
}: buildRacketPackage {
pname = "env";
version = "0";
unpackPhase = "touch nix-racket-env-only";
dependencies = packages;
tetheredInstallation = true;
racketEnvOnly = true;
}
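
a rough usage sketch of the environment builder above. it assumes the function is exposed by the overlay as `makeRacketEnv` and that the racket package set lives under `racketPackages` (both attribute names are assumptions for illustration); only the `packages` argument comes from the function itself, and `fmt`/`north` are packages defined elsewhere in this diff

```nix
# sketch only: assumes `pkgs` already has the dragnpkgs overlay applied
{ pkgs }:
pkgs.makeRacketEnv {
  # a tethered installation bundling a couple of packages
  packages = with pkgs.racketPackages; [ fmt north ];
}
```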

View File

@ -1,28 +0,0 @@
{
lib,
buildRacketPackage,
fetchFromGitHub,
}: (buildRacketPackage {
pname = "ansi-color";
version = "0.2+20363d9";
dependencies = [];
src = fetchFromGitHub {
owner = "renatoathaydes";
repo = "ansi-color";
rev = "20363d90fcef9219580ec0d6a78eea834df39d21";
hash = "sha256-PdTF4KaDecp7hYHlUAXXmZEfuvEfSF6Gf9A558b6v/I=";
};
gitSubpath = ".";
passthru = {
racketModules = ["ansi-color/main.rkt" "ansi-color/display.rkt" "ansi-color/scribblings/ansi-color.scrbl" "ansi-color/demo.rkt"];
racketLaunchers = [];
racoCommands = [];
};
meta = {
description = "A library to make it easy to write colorized and styled output in terminals that support ANSI escape codes (most command lines).";
sourceProvenance = [(((lib).sourceTypes).fromSource)];
broken = false;
license = [(((lib).licensesSpdx)."LGPL-3.0-or-later")];
homepage = "https://github.com/renatoathaydes/ansi-color/tree/HEAD/README.md";
};
})

View File

@ -1,26 +0,0 @@
{
lib,
fetchgit,
buildRacketPackage,
}: (buildRacketPackage {
pname = "br-parser-tools-lib";
version = "0.0+95b7c69";
dependencies = [];
src = fetchgit {
url = "https://gitlab.com/mbutterick/br-parser-tools.git";
rev = "95b7c69cf9d660a51abf4742378b9adb7100d25a";
hash = "sha256-and0y3rBjXwmgaEwwXzJOTgX/wCSY0uUfB3+U4JLTrk=";
};
gitSubpath = "br-parser-tools-lib";
passthru = {
racketModules = ["br-parser-tools/private-yacc/table.rkt" "br-parser-tools/private-lex/actions.rkt" "br-parser-tools/private-lex/stx.rkt" "br-parser-tools/private-yacc/yacc-helper.rkt" "br-parser-tools/private-lex/token-syntax.rkt" "br-parser-tools/examples/read.rkt" "br-parser-tools/yacc-to-scheme.rkt" "br-parser-tools/private-lex/token.rkt" "br-parser-tools/private-lex/unicode-chars.rkt" "br-parser-tools/private-yacc/input-file-parser.rkt" "br-parser-tools/private-lex/deriv.rkt" "br-parser-tools/lex.rkt" "br-parser-tools/private-yacc/lalr.rkt" "br-parser-tools/private-yacc/parser-builder.rkt" "br-parser-tools/private-yacc/graph.rkt" "br-parser-tools/private-yacc/lr0.rkt" "br-parser-tools/private-lex/error-tests.rkt" "br-parser-tools/cfg-parser.rkt" "br-parser-tools/private-lex/front.rkt" "br-parser-tools/yacc.rkt" "br-parser-tools/private-lex/re.rkt" "br-parser-tools/lex-sre.rkt" "br-parser-tools/private-yacc/parser-actions.rkt" "br-parser-tools/examples/calc.rkt" "br-parser-tools/lex-plt-v200.rkt" "br-parser-tools/private-yacc/grammar.rkt" "br-parser-tools/private-lex/util.rkt"];
racketLaunchers = [];
racoCommands = [];
};
meta = {
description = "fork of `parser-tools-lib` for Beautiful Racket";
sourceProvenance = [(((lib).sourceTypes).fromSource)];
broken = false;
license = [(((lib).licensesSpdx)."LGPL-3.0-or-later")];
};
})

View File

@ -1,27 +0,0 @@
{
lib,
br-parser-tools-lib,
fetchgit,
buildRacketPackage,
}: (buildRacketPackage {
pname = "brag-lib";
version = "0.0+30cbf95";
dependencies = [br-parser-tools-lib];
src = fetchgit {
url = "https://gitlab.com/mbutterick/brag.git";
rev = "30cbf95e6a717e71fb8bda6b15a7253aed36115a";
hash = "sha256-NJctskWDoBNRdBMDklALkMAPKT4A7on8pu6X3Q6NheE=";
};
gitSubpath = "brag-lib";
passthru = {
racketModules = ["brag/test/test-cutter.rkt" "brag/rules/rule-structs.rkt" "brag/examples/simple-line-drawing/examples/letter-i.rkt" "brag/test/test-hide-and-splice.rkt" "brag/test/test-simple-arithmetic-grammar.rkt" "brag/codegen/reader.rkt" "brag/codegen/codegen.rkt" "brag/examples/whitespace.rkt" "brag/examples/0n1.rkt" "brag/test/test-wordy.rkt" "brag/private/internal-support.rkt" "brag/examples/top-level-cut-3.rkt" "brag/examples/simple-line-drawing/lexer.rkt" "brag/examples/simple-arithmetic-grammar.rkt" "brag/test/test-parser.rkt" "brag/test/test-start-and-atok.rkt" "brag/examples/simple-line-drawing.rkt" "brag/rules/parser.rkt" "brag/examples/top-level-cut-2.rkt" "brag/main.rkt" "brag/test/test-0n1n.rkt" "brag/examples/simple-line-drawing/grammar.rkt" "brag/examples/wordy.rkt" "brag/rules/lexer.rkt" "brag/test/test-cutter-another.rkt" "brag/private/indenter.rkt" "brag/examples/cutter.rkt" "brag/examples/empty-symbol.rkt" "brag/examples/01-equal.rkt" "brag/test/test-baby-json.rkt" "brag/test/test-0n1.rkt" "brag/examples/baby-json-hider.rkt" "brag/test/test-empty-symbol.rkt" "brag/rules/stx-types.rkt" "brag/test/test-make-rule-parser.rkt" "brag/examples/simple-line-drawing/lang/reader.rkt" "brag/examples/top-level-cut-1.rkt" "brag/test/test-weird-grammar.rkt" "brag/test/test-whitespace.rkt" "brag/codegen/satisfaction.rkt" "brag/examples/nested-repeats.rkt" "brag/examples/simple-line-drawing/interpret.rkt" "brag/test/test-flatten.rkt" "brag/test/weird-grammar.rkt" "brag/test/test-all.rkt" "brag/examples/baby-json-alt2.rkt" "brag/examples/baby-json.rkt" "brag/test/test-01-equal.rkt" "brag/examples/statlist-grammar.rkt" "brag/examples/simple-line-drawing/semantics.rkt" "brag/examples/subrule.rkt" "brag/examples/lua-parser.rkt" "brag/test/test-quotation-marks-and-backslashes.rkt" "brag/test/test-lexer.rkt" "brag/test/test-nested-repeats.rkt" "brag/test/test-baby-json-hider.rkt" "brag/examples/start-and-atok.rkt" "brag/rules/stx.rkt" "brag/examples/add-mult.rkt" "brag/test/test-old-token.rkt" "brag/examples/cutter-another.rkt" "brag/test/test-top-level-cut.rkt" "brag/examples/bnf.rkt" "brag/codegen/runtime.rkt" "brag/test/test-codepoints.rkt" "brag/examples/codepoints.rkt" "brag/test/test-simple-line-drawing.rkt" "brag/test/test-errors.rkt" "brag/examples/hide-and-splice.rkt" "brag/examples/curly-quantifier.rkt" "brag/examples/nested-word-list.rkt" "brag/codegen/expander.rkt" "brag/examples/0n1n.rkt" "brag/private/colorer.rkt" "brag/codegen/flatten.rkt" "brag/examples/quotation-marks-and-backslashes.rkt" "brag/support.rkt" "brag/test/test-curly-quantifier.rkt" "brag/examples/baby-json-alt.rkt"];
racketLaunchers = [];
racoCommands = [];
};
meta = {
description = "";
sourceProvenance = [(((lib).sourceTypes).fromSource)];
broken = false;
license = [(((lib).licensesSpdx)."MIT")];
};
})

View File

@ -1,28 +0,0 @@
{
lib,
cldr-core,
buildRacketPackage,
fetchFromGitHub,
}: (buildRacketPackage {
pname = "cldr-bcp47";
version = "0.0+823fc1a";
dependencies = [cldr-core];
src = fetchFromGitHub {
owner = "97jaz";
repo = "cldr-bcp47";
rev = "823fc1a530f1a0ec4de59f5454c1a17f20c5a5d6";
hash = "sha256-YY5q44IQ1cNX4wk8Yt7B+z2uvfy+xMSl5tTDs+1RBlA=";
};
gitSubpath = ".";
passthru = {
racketModules = ["cldr/bcp47/timezone.rkt" "cldr/bcp47/scribblings/cldr-bcp47-timezone.scrbl"];
racketLaunchers = [];
racoCommands = [];
};
meta = {
description = "API for BCP47 extensions to CLDR";
sourceProvenance = [(((lib).sourceTypes).fromSource)];
broken = false;
license = [(((lib).licensesSpdx)."MIT") (((lib).licensesSpdx)."Unicode-TOU")];
};
})

View File

@ -1,28 +0,0 @@
{
lib,
buildRacketPackage,
fetchFromGitHub,
memoize-lib,
}: (buildRacketPackage {
pname = "cldr-core";
version = "0.0+c9b8077";
dependencies = [memoize-lib];
src = fetchFromGitHub {
owner = "97jaz";
repo = "cldr-core";
rev = "c9b80777c422c3b104bb85052d74a2dc1535a3c3";
hash = "sha256-Tpk6uYWz4//C+/n50wsLiD16rwOim85R/Ykrtcoa1+8=";
};
gitSubpath = ".";
passthru = {
racketModules = ["cldr/file.rkt" "cldr/likely-subtags.rkt" "cldr/core.rkt" "cldr/scribblings/cldr-core.scrbl"];
racketLaunchers = [];
racoCommands = [];
};
meta = {
description = "API for cldr-core data set";
sourceProvenance = [(((lib).sourceTypes).fromSource)];
broken = false;
license = [(((lib).licensesSpdx)."MIT") (((lib).licensesSpdx)."Unicode-TOU")];
};
})

View File

@ -1,28 +0,0 @@
{
lib,
cldr-core,
buildRacketPackage,
fetchFromGitHub,
}: (buildRacketPackage {
pname = "cldr-dates-modern";
version = "0.0+c362829";
dependencies = [cldr-core];
src = fetchFromGitHub {
owner = "97jaz";
repo = "cldr-dates-modern";
rev = "c36282917247f6a069e553535f4619007cd7b6e5";
hash = "sha256-byD2ubs543P9512lKD1JKB1ppyzjKzoWnuW8JPspa7M=";
};
gitSubpath = ".";
passthru = {
racketModules = ["cldr/dates-modern.rkt" "cldr/scribblings/cldr-dates-modern.scrbl"];
racketLaunchers = [];
racoCommands = [];
};
meta = {
description = "API for cldr-dates-modern data set";
sourceProvenance = [(((lib).sourceTypes).fromSource)];
broken = false;
license = [(((lib).licensesSpdx)."MIT") (((lib).licensesSpdx)."Unicode-TOU")];
};
})

View File

@ -1,28 +0,0 @@
{
lib,
cldr-core,
buildRacketPackage,
fetchFromGitHub,
}: (buildRacketPackage {
pname = "cldr-localenames-modern";
version = "0.0+f9f3e8d";
dependencies = [cldr-core];
src = fetchFromGitHub {
owner = "97jaz";
repo = "cldr-localenames-modern";
rev = "f9f3e8d9245764a309542816acf40fe147b473a3";
hash = "sha256-fZ1fnkslpZuicJgMh6/aLd4rPov7lvJr6ulDWpTMpKg=";
};
gitSubpath = ".";
passthru = {
racketModules = ["cldr/scribblings/cldr-localenames-modern.scrbl" "cldr/localenames-modern.rkt"];
racketLaunchers = [];
racoCommands = [];
};
meta = {
description = "API for cldr-localenames-modern data set";
sourceProvenance = [(((lib).sourceTypes).fromSource)];
broken = false;
license = [(((lib).licensesSpdx)."MIT") (((lib).licensesSpdx)."Unicode-TOU")];
};
})

View File

@ -1,28 +0,0 @@
{
lib,
cldr-core,
buildRacketPackage,
fetchFromGitHub,
}: (buildRacketPackage {
pname = "cldr-numbers-modern";
version = "0.0+6254280";
dependencies = [cldr-core];
src = fetchFromGitHub {
owner = "97jaz";
repo = "cldr-numbers-modern";
rev = "625428099b3f8cd264955a283dddc176a6080ba1";
hash = "sha256-RDa1d4sSyfyuNgz2dJdu2f1XGiO4cPOkaseZ7q2cLJU=";
};
gitSubpath = ".";
passthru = {
racketModules = ["cldr/scribblings/cldr-numbers-modern.scrbl" "cldr/numbers-modern.rkt"];
racketLaunchers = [];
racoCommands = [];
};
meta = {
description = "API for cldr-numbers-modern data set";
sourceProvenance = [(((lib).sourceTypes).fromSource)];
broken = false;
license = [(((lib).licensesSpdx)."MIT") (((lib).licensesSpdx)."Unicode-TOU")];
};
})

View File

@ -1,27 +0,0 @@
{
lib,
buildRacketPackage,
fetchFromGitHub,
}: (buildRacketPackage {
pname = "fancy-app";
version = "1.1+f451852";
dependencies = [];
src = fetchFromGitHub {
owner = "samth";
repo = "fancy-app";
rev = "f451852164ee67e3e122f25b4bce45001a557045";
hash = "sha256-2DdngIyocn+CrLf4A4yO9+XJQjIxzKVpmvGiNuM7mTQ=";
};
gitSubpath = ".";
passthru = {
racketModules = ["fancy-app/main.scrbl" "fancy-app/main.rkt"];
racketLaunchers = [];
racoCommands = [];
};
meta = {
description = "Scala-style anonymous functions";
sourceProvenance = [(((lib).sourceTypes).fromSource)];
broken = false;
license = [(((lib).licensesSpdx)."MIT")];
};
})

View File

@ -1,29 +0,0 @@
{
lib,
buildRacketPackage,
pretty-expressive,
fetchFromGitHub,
}: (buildRacketPackage {
pname = "fmt";
version = "0.0.3+002818e";
dependencies = [pretty-expressive];
src = fetchFromGitHub {
owner = "sorawee";
repo = "fmt";
rev = "002818ec08ad6e5e01f79e6209b69203581d6adc";
hash = "sha256-/oLq5WPpK/OO3ED7PBKTMCjDxTBy8+ZjoL/WPPC1zlU=";
};
gitSubpath = ".";
passthru = {
racketModules = ["fmt/tests/test-cases/let-cc-ec.rkt" "fmt/read.rkt" "fmt/tests/test-cases/test-dot.rkt" "fmt/realign.rkt" "fmt/tests/test-cases/test-deinprogramm.rkt" "fmt/tests/test-cases/define-contract.rkt" "fmt/scribblings/kws.rkt" "fmt/tests/test-cases/define-match.rkt" "fmt/tests/test-cases/general.rkt" "fmt/for-profiling.rkt" "fmt/tests/test-cases/send.rkt" "fmt/tests/benchmarks/class-internal.rkt" "fmt/params.rkt" "fmt/tests/test-cases/test-quasisyntax.rkt" "fmt/tests/test-cases/large2.rkt" "fmt/tests/permission-test.rkt" "fmt/.fmt.rkt" "fmt/tests/test-cases/cr.rkt" "fmt/tests/test-cases/test-asl.rkt" "fmt/private/memoize.rkt" "fmt/tests/benchmarks/xform.rkt" "fmt/tests/test-cases/test-if.rkt" "fmt/version.rkt" "fmt/core.rkt" "fmt/tests/benchmarks/list.rkt" "fmt/tokenize.rkt" "fmt/raco.rkt" "fmt/conventions.rkt" "fmt/tests/test-cases/large.rkt" "fmt/tests/config-tests/file.rkt" "fmt/tests/test-cases/rackunit.rkt" "fmt/tests/benchmarks/hash.rkt" "fmt/tests/test-cases/test-hash-bang.rkt" "fmt/tests/test-cases/test-herestring.rkt" "fmt/tests/config-tests/config.rkt" "fmt/scribblings/fmt.scrbl" "fmt/record.rkt" "fmt/tests/test-cases/test-class.rkt" "fmt/common.rkt" "fmt/tests/test-cases/let-values.rkt" "fmt/tests/test-cases/test-lambda.rkt" "fmt/scribblings/examples/example.rkt" "fmt/tests/test-cases/delay.rkt" "fmt/main.rkt" "fmt/scribblings/util.rkt" "fmt/regen.rkt"];
racketLaunchers = [];
racoCommands = ["fmt"];
};
meta = {
description = "An extensible code formatter for Racket";
sourceProvenance = [(((lib).sourceTypes).fromSource)];
broken = false;
license = [(((lib).licensesSpdx)."Apache-2.0") (((lib).licensesSpdx)."MIT")];
homepage = "https://github.com/sorawee/fmt/tree/HEAD/README.md";
};
})

View File

@ -1,35 +0,0 @@
{
fetchFromGitHub,
cldr-localenames-modern,
memoize-lib,
cldr-bcp47,
cldr-dates-modern,
lib,
cldr-core,
buildRacketPackage,
cldr-numbers-modern,
tzinfo,
}: (buildRacketPackage {
pname = "gregor-lib";
version = "0.0+f56215d";
dependencies = [memoize-lib tzinfo cldr-core cldr-bcp47 cldr-numbers-modern cldr-dates-modern cldr-localenames-modern];
src = fetchFromGitHub {
owner = "97jaz";
repo = "gregor";
rev = "f56215db229ef2e33670f55d08c0330d8f85de23";
hash = "sha256-4TIeinXk7ak7sbT2lwfWYdwIwFD9S7whBrR2KEajW30=";
};
gitSubpath = "gregor-lib";
passthru = {
racketModules = ["gregor/private/pattern/l10n/numbers.rkt" "gregor/private/period.rkt" "gregor/private/pattern/l10n/zone-util.rkt" "gregor/private/pattern/ast/era.rkt" "gregor/private/pattern/ast/second.rkt" "gregor/private/pattern/l10n/gmt-offset.rkt" "gregor/private/pattern/ast/hour.rkt" "gregor/private/pattern/ast/minute.rkt" "gregor/private/pattern/l10n/named-trie.rkt" "gregor/private/core/compare.rkt" "gregor/private/pattern/ast/week.rkt" "gregor/private/iso8601-parse.rkt" "gregor/main.rkt" "gregor/private/pattern/l10n/l10n-week.rkt" "gregor/private/pattern/l10n/iso-offset.rkt" "gregor/private/pattern/l10n/trie.rkt" "gregor/private/exn.rkt" "gregor/private/pattern/l10n/zone-id.rkt" "gregor/time.rkt" "gregor/private/pattern/lexer.rkt" "gregor/private/clock.rkt" "gregor/private/pattern/l10n/zone-loc.rkt" "gregor/private/pattern/ast.rkt" "gregor/private/pattern/l10n/symbols.rkt" "gregor/private/pattern/ast/year.rkt" "gregor/private/pattern/ast/zone.rkt" "gregor/private/pattern/parse-state.rkt" "gregor/private/pattern/ast/literal.rkt" "gregor/private/difference.rkt" "gregor/private/core/math.rkt" "gregor/private/pattern/ast/month.rkt" "gregor/private/time.rkt" "gregor/private/pattern/ast/period.rkt" "gregor/private/pattern/ast/weekday.rkt" "gregor/private/datetime.rkt" "gregor/private/moment-base.rkt" "gregor/private/pattern/ast/day.rkt" "gregor/private/parse.rkt" "gregor/private/date.rkt" "gregor/private/moment.rkt" "gregor/private/format.rkt" "gregor/private/pattern/l10n/zone-nonloc.rkt" "gregor/private/generics.rkt" "gregor/period.rkt" "gregor/private/pattern/ast/separator.rkt" "gregor/private/core/structs.rkt" "gregor/private/core/hmsn.rkt" "gregor/private/pattern/l10n/metazone.rkt" "gregor/private/pattern/ast/quarter.rkt" "gregor/private/core/ymd.rkt" "gregor/private/offset-resolvers.rkt"];
racketLaunchers = [];
racoCommands = [];
};
meta = {
description = "Code part of the gregor date and time library";
sourceProvenance = [(((lib).sourceTypes).fromSource)];
broken = false;
license = [(((lib).licensesSpdx)."MIT")];
homepage = "https://github.com/97jaz/gregor/tree/HEAD/README.md";
};
})

View File

@ -1,28 +0,0 @@
{
lib,
buildRacketPackage,
fetchFromGitHub,
}: (buildRacketPackage {
pname = "guard";
version = "0.0+de93f4b";
dependencies = [];
src = fetchFromGitHub {
owner = "jackfirth";
repo = "guard";
rev = "de93f4b5f38f1086177a09a40583af2932759b75";
hash = "sha256-z5sUidOIadtOZqVRBPxjIAz/D71U9XiE06EE+DGZzBg=";
};
gitSubpath = ".";
passthru = {
racketModules = ["guard/private/scribble-evaluator-factory.rkt" "guard/scribblings/guard.scrbl" "guard/main.rkt"];
racketLaunchers = [];
racoCommands = [];
};
meta = {
description = "Macros similar to Swift's \"guard statements\".";
sourceProvenance = [(((lib).sourceTypes).fromSource)];
broken = false;
license = [(((lib).licensesSpdx)."Apache-2.0") (((lib).licensesSpdx)."MIT")];
homepage = "https://github.com/jackfirth/guard/tree/HEAD/README.md";
};
})

View File

@ -1,28 +0,0 @@
{
lib,
buildRacketPackage,
fetchFromGitHub,
}: (buildRacketPackage {
pname = "memoize-lib";
version = "3.0+f373706";
dependencies = [];
src = fetchFromGitHub {
owner = "jbclements";
repo = "memoize";
rev = "f373706824145ce2a8247edb76278d6df139333c";
hash = "sha256-87a5nSpOZaal1/t5GMk5yFHX1daukabYQ/1J4L5LN4o=";
};
gitSubpath = "memoize-lib";
passthru = {
racketModules = ["memoize/main.rkt"];
racketLaunchers = [];
racoCommands = [];
};
meta = {
description = "core library for memoize";
sourceProvenance = [(((lib).sourceTypes).fromSource)];
broken = false;
license = [(((lib).licensesSpdx)."MIT")];
homepage = "https://github.com/jbclements/memoize/tree/master/README.md";
};
})

View File

@ -1,29 +0,0 @@
{
gregor-lib,
buildRacketPackage,
lib,
fetchFromGitHub,
}: (buildRacketPackage {
pname = "north";
version = "0.8+00e5221";
dependencies = [gregor-lib];
src = fetchFromGitHub {
owner = "Bogdanp";
repo = "racket-north";
rev = "00e52217081d421bcdd1c2248e309e0d92dd5314";
hash = "sha256-oSjrLNsQ53vUIFRF2spie7o/NSrlF29Dqw2et9Isf3o=";
};
gitSubpath = "north";
passthru = {
racketModules = ["north/main.rkt" "north/north.scrbl" "north/tool/syntax-color.rkt" "north/adapter/sqlite.rkt" "north/adapter/base.rkt" "north/migrate.rkt" "north/adapter/postgres.rkt" "north/lang/reader.rkt" "north/base.rkt" "north/cli.rkt"];
racketLaunchers = [];
racoCommands = ["north"];
};
meta = {
description = "A database migration tool.";
sourceProvenance = [(((lib).sourceTypes).fromSource)];
broken = false;
license = [(((lib).licensesSpdx)."BSD-3-Clause")];
homepage = "https://github.com/Bogdanp/racket-north/tree/HEAD/README.md";
};
})

View File

@ -1,28 +0,0 @@
{
lib,
buildRacketPackage,
fetchFromGitHub,
}: (buildRacketPackage {
pname = "pretty-expressive";
version = "1.1+0984931";
dependencies = [];
src = fetchFromGitHub {
owner = "sorawee";
repo = "pretty-expressive";
rev = "0984931c6f8ff32921dd477c875127de7600dfd5";
hash = "sha256-5WokTHS90pYo5ltJEWX5MIMyUWr2AlRU/W2bznLQ74U=";
};
gitSubpath = ".";
passthru = {
racketModules = ["pretty-expressive/benchmarks/json.rkt" "pretty-expressive/core.rkt" "pretty-expressive/benchmarks/sexp-random.rkt" "pretty-expressive/benchmarks/sexp-full.rkt" "pretty-expressive/benchmarks/concat.rkt" "pretty-expressive/doc.rkt" "pretty-expressive/addons.rkt" "pretty-expressive/scribblings/pretty-expressive.scrbl" "pretty-expressive/benchmarks/fill-sep.rkt" "pretty-expressive/main.rkt" "pretty-expressive/benchtool.rkt" "pretty-expressive/benchmarks/flatten.rkt" "pretty-expressive/benchmarks/wadler-opt.rkt" "pretty-expressive/examples.rkt" "pretty-expressive/process.rkt" "pretty-expressive/promise.rkt"];
racketLaunchers = [];
racoCommands = [];
};
meta = {
description = "A pretty expressive printer";
sourceProvenance = [(((lib).sourceTypes).fromSource)];
broken = false;
license = [(((lib).licensesSpdx)."Apache-2.0") (((lib).licensesSpdx)."MIT")];
homepage = "https://github.com/sorawee/pretty-expressive/tree/main/README.md";
};
})

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

View File

@ -1,29 +0,0 @@
{
lib,
cldr-core,
buildRacketPackage,
fetchFromGitHub,
}: (buildRacketPackage {
pname = "tzinfo";
version = "0.6+2f81228";
dependencies = [cldr-core];
src = fetchFromGitHub {
owner = "97jaz";
repo = "tzinfo";
rev = "2f812283d9c90040aecb3c7e2ed2edf93a3720de";
hash = "sha256-vvb3EZHFysa/2OiTat+i8zuALxiCPHNNaWCGlyPF6gk=";
};
gitSubpath = ".";
passthru = {
racketModules = ["tzinfo/zoneinfo.rkt" "tzinfo/private/os/env.rkt" "tzinfo/private/os/unix.rkt" "tzinfo/private/tabfile-parser.rkt" "tzinfo/main.rkt" "tzinfo/source.rkt" "tzinfo/private/zoneinfo-search.rkt" "tzinfo/private/tzfile-parser.rkt" "tzinfo/private/os/windows.rkt" "tzinfo/private/zoneinfo.rkt" "tzinfo/private/structs.rkt" "tzinfo/test/zoneinfo.rkt" "tzinfo/private/os/windows-registry.rkt" "tzinfo/private/generics.rkt" "tzinfo/scribblings/tzinfo.scrbl"];
racketLaunchers = [];
racoCommands = [];
};
meta = {
description = "API for querying the IANA tz database";
sourceProvenance = [(((lib).sourceTypes).fromSource)];
broken = false;
license = [(((lib).licensesSpdx)."MIT")];
homepage = "https://github.com/97jaz/tzinfo/tree/HEAD/README.md";
};
})

View File

@ -1,7 +0,0 @@
{
racket,
makeSetupHook,
}: makeSetupHook {
name = "racket-install-hook";
propagatedBuildInputs = [ racket ];
} ./racket-install-hook.sh

View File

@ -1,159 +0,0 @@
echo "Sourcing racket-install-hook"
addRacketPath() {
if [ -f "$1/nix-support/racket-pkg" ]; then
addToSearchPathWithCustomDelimiter : NIX_RACKET_PKG_PATH $1
fi
}
racketInstallPhase() {
echo "Executing racketInstallPhase"
cd "$racketGitSubpath"
runHook preInstall
mkdir -p $out/{include,etc/racket,lib/racket,share/racket/pkgs,share/racket/collects,bin,share/applications,share/doc/racket,share/man}
mkdir -p $out/nix-support
touch $out/nix-support/racket-pkg
out="$out" tethered="$racketTetheredInstallation" \
racket --no-user-path -nl racket/base -f - <<EOF
(require racket/function racket/hash racket/list racket/pretty racket/string racket/match)
(define out (getenv "out"))
(define pkgs (path-list-string->path-list (or (getenv "NIX_RACKET_PKG_PATH") "") '()))
(define tethered? (equal? (getenv "tethered") "1"))
(define base-config (read-installation-configuration-table))
(define (add-to-search added-list search-list)
(match search-list
['() (error "no #f found in search list!")]
[(cons #f rst) (cons #f (append added-list rst))]
[(cons fst rst) (cons fst (add-to-search added-list rst))]))
(define (make-search-path* key list-key [pkgs-search '()])
(define old-search-list (hash-ref base-config list-key '(#f)))
(define old-value
(cond
[(hash-has-key? base-config key)
(list (hash-ref base-config key))]
[(eq? key 'links-file)
(list
(path->string
(build-path (hash-ref base-config 'share-dir) "links.rktd")))]
[else (error "no key" key)]))
(define added-list (append pkgs-search old-value))
(add-to-search added-list old-search-list))
(define (default-location pkg key)
(path->string
(match key
['include-dir (build-path pkg "include")]
['lib-dir (build-path pkg "lib/racket")]
['share-dir (build-path pkg "share/racket")]
['pkgs-dir (build-path pkg "share/racket/pkgs")]
['links-file (build-path pkg "share/racket/links.rktd")]
['bin-dir (build-path pkg "bin")]
['doc-dir (build-path pkg "share/doc/racket")]
['man-dir (build-path pkg "share/man")]
[_ (error "unexpected key:" key)])))
(define (make-search-path key list-key)
(define pkgs-search
(for/list ([pkg (in-list pkgs)])
(default-location pkg key)))
(make-search-path* key list-key pkgs-search))
(define (add-libs lib-path)
(define ldflags (string-split (getenv "NIX_LDFLAGS")))
(define libs
(for/list ([lib (in-list ldflags)] #:when (string-prefix? lib "-L"))
(string-trim lib "-L" #:right? #f)))
(remove-duplicates (append libs lib-path)))
(define config*
(hash
'absolute-installation? #t
'build-stamp ""
'catalogs (hash-ref base-config 'catalogs)
'compiled-file-roots (hash-ref base-config 'compiled-file-roots)
'apps-dir (path->string (build-path out "share/applications"))
'bin-dir (default-location out 'bin-dir)
'bin-search-dirs (make-search-path 'bin-dir 'bin-search-dirs)
'doc-dir (default-location out 'doc-dir)
'doc-search-dirs (make-search-path 'doc-dir 'doc-search-dirs)
'doc-search-url (hash-ref base-config 'doc-search-url)
'include-dir (default-location out 'include-dir)
'include-search-dirs (make-search-path 'include-dir 'include-search-dirs)
'lib-dir (default-location out 'lib-dir)
'lib-search-dirs (add-libs (make-search-path 'lib-dir 'lib-search-dirs))
'links-file (default-location out 'links-file)
'links-search-files (make-search-path 'links-file 'links-search-files)
'man-dir (default-location out 'man-dir)
'man-search-dirs (make-search-path 'man-dir 'man-search-dirs)
'pkgs-dir (default-location out 'pkgs-dir)
'pkgs-search-dirs (make-search-path 'pkgs-dir 'pkgs-search-dirs)
'share-dir (default-location out 'share-dir)
'share-search-dirs (make-search-path 'share-dir 'share-search-dirs)))
(define config
(if tethered?
(hash-union
config*
(hash
'config-tethered-console-bin-dir (hash-ref config* 'bin-dir)
'config-tethered-gui-bin-dir (hash-ref config* 'bin-dir)
'config-tethered-apps-dir (hash-ref config* 'apps-dir)))
config*))
(with-output-to-file (build-path out "etc/racket/config.rktd")
(curry pretty-write config))
EOF
echo Initializing installation layer
if [ "$racketTetheredInstallation" == "1" ]; then
racket --config $out/etc/racket/ --no-user-path -l- \
raco setup --no-zo
elif [ "$racketDoMainSetup" == "1" ]; then
racket --config $out/etc/racket/ --no-user-path -l- \
raco setup --no-zo --no-launcher
rm $out/bin/mzscheme # ????
fi
if [ "$racketEnvOnly" == "1" ]; then
echo Skipping raco pkg install
else
echo Running raco pkg install
racoflags=""
if [ "$racketBuildDocs" != "1" ]; then
racoflags="--no-docs"
fi
racket --config $out/etc/racket/ --no-user-path -l- \
raco pkg install --installation --deps fail --copy --name "$pname" $racoflags \
"$(readlink -e .)"
fi
runHook postInstall
echo "Finished executing racketInstallPhase"
}
if [ -z "${dontUseRacketInstall-}" ] && [ -z "${installPhase-}" ]; then
echo "Adding racket env hook"
addEnvHooks "$targetOffset" addRacketPath
echo "Using racketInstallPhase"
installPhase=racketInstallPhase
fi
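
the final conditional only installs `racketInstallPhase` when neither `dontUseRacketInstall` nor a custom `installPhase` is set, while the `addRacketPath` env hook runs regardless. a hedged sketch of a derivation that opts out of the default install phase but still gets `NIX_RACKET_PKG_PATH` populated (the package name and layout are made up)

```nix
{ buildRacketPackage }:
buildRacketPackage {
  pname = "example-custom-install"; # hypothetical
  version = "0";
  src = ./.;
  # either of these is enough to keep the hook from setting installPhase
  dontUseRacketInstall = true;
  installPhase = ''
    runHook preInstall
    mkdir -p $out
    cp -r . $out/src
    runHook postInstall
  '';
}
```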

View File

@ -1,27 +0,0 @@
#lang racket/base
(require
racket/function
racket/list
racket/pretty
racket/string
setup/dirs
)
(define config-file (build-path (find-config-dir) "config.rktd"))
(define lib-paths
((compose remove-duplicates
(curry map (curryr string-trim "-L" #:right? #f))
(curry filter (curryr string-prefix? "-L"))
string-split)
(getenv "NIX_LDFLAGS")))
(define config
(let* ([prev-config (read-installation-configuration-table)]
[prev-lib-search-dirs (hash-ref prev-config 'lib-search-dirs '(#f))]
[lib-search-dirs (remove-duplicates (append lib-paths prev-lib-search-dirs))])
(hash-set prev-config 'lib-search-dirs lib-search-dirs)))
(call-with-output-file config-file
#:exists 'replace
(curry pretty-write config))

View File

@ -1,11 +0,0 @@
{
"version": "8.18",
"full": {
"filename": "racket-8.18-src.tgz",
"sha256": "65477c71ec1a978a6ee4db582b9b47b1a488029d7a42e358906de154a6e5905c"
},
"minimal": {
"filename": "racket-minimal-8.18-src.tgz",
"sha256": "24b9cf8365254b43bac308192c782edfbd86363df1322c4e063b797ed0f7db66"
}
}

View File

@ -1,175 +0,0 @@
{
lib,
stdenv,
fetchurl,
libiconvReal,
libz,
lz4,
ncurses,
openssl,
sqlite,
disableDocs ? false,
callPackage,
writers,
}:
let
manifest = lib.importJSON ./manifest.json;
inherit (stdenv.hostPlatform) isDarwin;
in
stdenv.mkDerivation (finalAttrs: {
pname = "racket";
inherit (manifest) version;
src = fetchurl {
url = "https://mirror.racket-lang.org/installers/${manifest.version}/${manifest.minimal.filename}";
inherit (manifest.minimal) sha256;
};
buildInputs = [
libiconvReal
libz
lz4
ncurses
openssl
sqlite.out
];
patches = lib.optionals isDarwin [
/*
The entry point binary $out/bin/racket is codesigned at least once. The
following error is triggered as a result.
(error 'add-ad-hoc-signature "file already has a signature")
We always remove the existing signature then call add-ad-hoc-signature to
circumvent this error.
*/
./patches/force-remove-codesign-then-add.patch
];
preConfigure =
/*
The configure script forces using `libtool -o` as AR on Darwin. But, the
`-o` option is only available from Apple libtool. GNU ar works here.
*/
lib.optionalString isDarwin ''
substituteInPlace src/ChezScheme/zlib/configure \
--replace-fail 'ARFLAGS="-o"' 'AR=ar; ARFLAGS="rc"'
''
+ ''
mkdir src/build
cd src/build
'';
configureScript = "../configure";
configureFlags = [
# > docs failure: ftype-ref: ftype mismatch for #<ftype-pointer>
# "--enable-check"
"--enable-csonly"
"--enable-liblz4"
"--enable-libz"
]
++ lib.optional disableDocs "--disable-docs"
++ lib.optionals (!(finalAttrs.dontDisableStatic or false)) [
# instead of `--disable-static` that `stdenv` assumes
"--disable-libs"
# "not currently supported" in `configure --help-cs` but still emphasized in README
"--enable-shared"
]
++ lib.optionals isDarwin [
"--disable-strip"
# "use Unix style (e.g., use Gtk) for Mac OS", which eliminates many problems
"--enable-xonx"
];
# The upstream script builds static libraries by default.
dontAddStaticConfigureFlags = true;
dontStrip = isDarwin;
postFixup =
let
configureInstallation = builtins.path {
name = "configure-installation.rkt";
path = ./configure-installation.rkt;
};
in
''
$out/bin/racket -U -u ${configureInstallation}
'';
passthru = {
# Functionalities #
updateScript = {
command = ./update.py;
attrPath = "racket";
supportedFeatures = [ "commit" ];
};
writeScript =
nameOrPath:
{
libraries ? [ ],
...
}@config:
assert lib.assertMsg (libraries == [ ]) "library integration for Racket has not been implemented";
writers.makeScriptWriter (
builtins.removeAttrs config [ "libraries" ]
// {
interpreter = "${lib.getExe finalAttrs.finalPackage}";
}
) nameOrPath;
writeScriptBin = name: finalAttrs.passthru.writeScript "/bin/${name}";
# Tests #
tests = builtins.mapAttrs (name: path: callPackage path { racket = finalAttrs.finalPackage; }) {
## Basic ##
write-greeting = ./tests/write-greeting.nix;
get-version-and-variant = ./tests/get-version-and-variant.nix;
load-openssl = ./tests/load-openssl.nix;
## Nixpkgs supports ##
nix-write-script = ./tests/nix-write-script.nix;
};
};
meta = {
description = "Programmable programming language (minimal distribution)";
longDescription = ''
Racket is a full-spectrum programming language. It goes beyond
Lisp and Scheme with dialects that support objects, types,
laziness, and more. Racket enables programmers to link
components written in different dialects, and it empowers
programmers to create new, project-specific dialects. Racket's
libraries support applications from web servers and databases to
GUIs and charts.
This minimal distribution includes just enough of Racket that you can
use `raco pkg` to install more.
'';
homepage = "https://racket-lang.org/";
changelog = "https://github.com/racket/racket/releases/tag/v${finalAttrs.version}";
/*
> Racket is distributed under the MIT license and the Apache version 2.0
> license, at your option.
> The Racket runtime system embeds Chez Scheme, which is distributed
> under the Apache version 2.0 license.
*/
license = with lib.licenses; [
asl20
mit
];
sourceProvenance = with lib.sourceTypes; [
fromSource
binaryBytecode
];
maintainers = with lib.maintainers; [ rc-zb ];
mainProgram = "racket";
platforms = lib.platforms.all;
};
})

View File

@ -1,140 +0,0 @@
{
lib,
stdenv,
fetchurl,
racket-minimal,
cairo,
fontconfig,
glib,
glibcLocales,
gtk3,
libGL,
libiodbc,
libjpeg,
libpng,
makeFontsConf,
pango,
unixODBC,
wrapGAppsHook3,
disableDocs ? false,
callPackage,
}:
let
minimal = racket-minimal.override { inherit disableDocs; };
manifest = lib.importJSON ./manifest.json;
inherit (stdenv.hostPlatform) isDarwin;
in
minimal.overrideAttrs (
finalAttrs: prevAttrs: {
src = fetchurl {
url = "https://mirror.racket-lang.org/installers/${manifest.version}/${manifest.full.filename}";
inherit (manifest.full) sha256;
};
buildInputs = prevAttrs.buildInputs ++ [
(if isDarwin then libiodbc else unixODBC)
cairo
fontconfig.lib
glib
gtk3
libGL
libjpeg
libpng
pango
];
nativeBuildInputs = [
wrapGAppsHook3
];
patches = prevAttrs.patches or [ ] ++ [
/*
Hardcode variant detection because nixpkgs wraps the Racket binary making it
fail to detect its variant at runtime.
https://github.com/NixOS/nixpkgs/issues/114993#issuecomment-812951247
*/
./patches/force-cs-variant.patch
];
preBuild =
let
libPathsVar = if isDarwin then "DYLD_FALLBACK_LIBRARY_PATH" else "LD_LIBRARY_PATH";
in
/*
Makes FFIs available for setting up `main-distribution` and its
dependencies, which is integrated into the build process of Racket
*/
''
for lib_path in $( \
echo "$NIX_LDFLAGS" \
| tr ' ' '\n' \
| grep '^-L' \
| sed 's/^-L//' \
| awk '!seen[$0]++' \
); do
addToSearchPath ${libPathsVar} $lib_path
done
''
# Fixes Fontconfig errors
+ ''
export FONTCONFIG_FILE=${makeFontsConf { fontDirectories = [ ]; }}
export XDG_CACHE_HOME=$(mktemp -d)
'';
# Disable automatic wrapping, and only wrap the ELF binaries:
#
# - bin/racket
# - lib/racket/gracket
# - bin/mred
# - bin/mzscheme
#
# This avoids effectively double-wrapping shell scripts generated by raco, because they will
# call into the wrapped ELF binaries
dontWrapGApps = true;
preFixup = (lib.optionalString (!isDarwin) ''
gappsWrapperArgs+=("--set" "LOCALE_ARCHIVE" "${glibcLocales}/lib/locale/locale-archive")
'') + ''
wrapProgram $out/bin/racket "''${gappsWrapperArgs[@]}"
wrapProgram $out/bin/mred "''${gappsWrapperArgs[@]}"
wrapProgram $out/bin/mzscheme "''${gappsWrapperArgs[@]}"
wrapProgram $out/lib/racket/gracket "''${gappsWrapperArgs[@]}"
'';
passthru =
let
notUpdated = x: !builtins.isAttrs x || lib.isDerivation x;
stopPred =
_: lhs: rhs:
notUpdated lhs || notUpdated rhs;
in
lib.recursiveUpdateUntil stopPred prevAttrs.passthru {
tests = builtins.mapAttrs (name: path: callPackage path { racket = finalAttrs.finalPackage; }) {
## `main-distribution` ##
draw-crossing = ./tests/draw-crossing.nix;
};
};
meta = prevAttrs.meta // {
description = "Programmable programming language";
longDescription = ''
Racket is a full-spectrum programming language. It goes beyond
Lisp and Scheme with dialects that support objects, types,
laziness, and more. Racket enables programmers to link
components written in different dialects, and it empowers
programmers to create new, project-specific dialects. Racket's
libraries support applications from web servers and databases to
GUIs and charts.
'';
platforms = lib.platforms.unix;
badPlatforms = lib.platforms.darwin;
};
}
)

View File

@ -1,12 +0,0 @@
--- old/collects/setup/variant.rkt
+++ new/collects/setup/variant.rkt
@@ -7,7 +7,8 @@
(provide variant-suffix
script-variant?)
-(define plain-variant
+(define plain-variant 'cs)
+#;(define plain-variant
(delay/sync
(cond
[(cross-installation?)

View File

@ -1,10 +0,0 @@
--- old/src/mac/codesign.rkt
+++ new/src/mac/codesign.rkt
@@ -18,6 +18,6 @@
file))
(void
- (if remove?
+ (begin
(remove-signature file)
(add-ad-hoc-signature file)))

View File

@ -1,18 +0,0 @@
{ runCommandLocal, racket }:
runCommandLocal "racket-test-draw-crossing"
{
nativeBuildInputs = [ racket ];
}
''
racket -f - <<EOF
(require racket/draw)
(define target (make-bitmap 64 64))
(define dc (new bitmap-dc% [bitmap target]))
(send dc draw-line 0 0 64 64)
(send dc draw-line 0 64 64 0)
(send target save-file (getenv "out") 'png)
EOF
''

View File

@ -1,45 +0,0 @@
{
lib,
runCommandLocal,
racket,
}:
runCommandLocal "racket-test-get-version-and-variant"
{
nativeBuildInputs = [ racket ];
}
(
lib.concatLines (
builtins.map
(
{ expectation, output }:
''
expectation="${expectation}"
output="${output}"
if test "$output" != "$expectation"; then
echo "output mismatch: expected ''${expectation}, but got ''${output}"
exit 1
fi
''
)
[
{
expectation = racket.version;
output = "$(racket -e '(display (version))')";
}
{
expectation = "cs";
output = "$(racket -e '(require launcher/launcher) (display (current-launcher-variant))')";
}
{
expectation = "${lib.getExe racket}";
output = "$(racket -e '(require compiler/find-exe) (display (find-exe))')";
}
]
)
+ ''
touch $out
''
)

View File

@ -1,15 +0,0 @@
{ runCommandLocal, racket }:
runCommandLocal "racket-test-load-openssl"
{
nativeBuildInputs = [ racket ];
}
''
racket -f - <<EOF
(require openssl)
(unless ssl-available?
(raise ssl-load-fail-reason))
EOF
touch $out
''

View File

@ -1,26 +0,0 @@
{ runCommandLocal, racket }:
let
script = racket.writeScript "racket-test-nix-write-script-the-script" { } ''
#lang racket/base
(display "success")
(newline)
'';
in
runCommandLocal "racket-test-nix-write-script"
{
nativeBuildInputs = [ racket ];
}
''
expectation="success"
output="$(${script})"
if test "$output" != "$expectation"; then
echo "output mismatch: expected ''${expectation}, but got ''${output}"
exit 1
fi
touch $out
''

View File

@ -1,23 +0,0 @@
{ runCommandLocal, racket }:
runCommandLocal "racket-test-write-greeting"
{
nativeBuildInputs = [ racket ];
}
''
expectation="Hello, world!"
racket -f - <<EOF
(with-output-to-file (getenv "out")
(lambda ()
(display "Hello, world!")
(newline)))
EOF
output="$(cat $out)"
if test "$output" != "$expectation"; then
echo "output mismatch: expected ''${expectation}, but got ''${output}"
exit 1
fi
''

Some files were not shown because too many files have changed in this diff