feat!: Remove ImJoy API and cleanup notebook with preferred anywidget API #186

Merged 7 commits on Jul 19, 2024
1 change: 1 addition & 0 deletions .gitignore
@@ -33,3 +33,4 @@ __pycache__
.venv
.ipynb_checkpoints
dist/
astronaut.zarr
90 changes: 60 additions & 30 deletions README.md
@@ -1,49 +1,79 @@
<h1>
<p align="center">
<img width="400" src="./assets/logo-wide.png" alt="Vizarr">
<img src="./assets/logo-wide.svg" alt="vizarr" width="200">
</h1>
<samp>
<p align="center">
<span>view multiscale zarr images online and in notebooks</span>
<br>
<br>
<a href="https://hms-dbmi.github.io/vizarr/?source=https://minio-dev.openmicroscopy.org/idr/v0.3/idr0062-blin-nuclearsegmentation/6001240.zarr">app</a> .
<a href="./python/notebooks/getting_started.ipynb">getting started</a>
</p>
</samp>
</p>

[![Binder](https://mybinder.org/badge_logo.svg)](https://mybinder.org/v2/gh/hms-dbmi/vizarr/main?filepath=example%2Fgetting_started.ipynb)
[![launch ImJoy](https://imjoy.io/static/badge/launch-imjoy-badge.svg)](https://imjoy.io/lite?plugin=https://github.com/hms-dbmi/vizarr/blob/main/example/VizarrDemo.imjoy.html)
[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/hms-dbmi/vizarr/blob/main/example/mandelbrot.ipynb)
[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/hms-dbmi/vizarr/blob/main/python/notebooks/mandelbrot.ipynb)

![Multiscale OME-Zarr in Jupyter Notebook with Vizarr](./assets/screenshot.png)
<center>
<img src="./assets/screenshot.png" alt="Multiscale OME-Zarr in Jupyter Notebook with Vizarr" width="400">
</center>

Vizarr is a minimal, purely client-side program for viewing Zarr-based images. It is built with
[Viv](https://github.com/hms-dbmi/viv) and exposes a Python API using the
[`imjoy-rpc`](https://github.com/imjoy-team/imjoy-rpc), allowing users to programmatically view multiplex
and multiscale images from within a Jupyter Notebook. The ImJoy plugin registers a codec for Python
`zarr.Array` and `zarr.Group` objects, enabling Viv to securely request chunks lazily via
[Zarr.js](https://github.com/gzuidhof/zarr.js/). This means that other valid zarr-python
[stores](https://zarr.readthedocs.io/en/stable/api/storage.html) can be viewed remotely with Viv,
enabling flexible workflows when working with large datasets.
**Vizarr** is a minimal, purely client-side program for viewing zarr-based images.

### Remote image registration workflow
We created Vizarr to enhance interactive multimodal image alignment using the
[wsireg](https://github.com/NHPatterson/wsireg) library. We describe a rapid workflow in which
registration methods can be compared and alignment visually verified
remotely, leveraging high-performance computational resources for rapid image processing and
Viv for interactive web-based visualization on a laptop. The Jupyter Notebook containing
the workflow described in the manuscript can be found in [`multimodal_registration_vizarr.ipynb`](multimodal_registration_vizarr.ipynb). For more information, please read our preprint [doi:10.31219/osf.io/wd2gu](https://doi.org/10.31219/osf.io/wd2gu).

> Note: The data required to run this notebook is too large to include in this repository but is available upon request.
- ⚡ **GPU accelerated rendering** with [Viv](https://github.com/hms-dbmi/viv)
- 💻 Purely **client-side** zarr access with [zarrita.js](https://github.com/manzt/zarrita.js)
- 🌎 A **standalone [web app](https://hms-dbmi.github.io/vizarr/)** for viewing entirely in the browser.
- 🐍 An [anywidget](https://github.com/manzt/anywidget) **Python API** for
programmatic control in notebooks.
- 📦 Supports any `zarr-python` [store](https://zarr.readthedocs.io/en/stable/api/storage.html)
as a backend.

### Data types
Vizarr supports viewing 2D slices of n-Dimensional Zarr arrays, allowing users to choose
a single channel or blended composites of multiple channels during analysis. It has special support
for the developing [OME-Zarr format](https://github.com/ome/omero-ms-zarr/blob/master/spec.md)
for multiscale and multimodal images. Currently [Viv](https://github.com/hms-dbmi/viv) supports
`i1`, `i2`, `i4`, `u1`, `u2`, `u4`, and `f4` arrays, but contributions are welcome to support more `np.dtypes`!

### Getting started
The easiest way to get started with `vizarr` is to clone this repository and open one of
the example [Jupyter Notebooks](example/).
**Vizarr** supports viewing 2D slices of n-Dimensional Zarr arrays, allowing
users to choose a single channel or blended composites of multiple channels
during analysis. It has special support for the developing OME-NGFF format for
multiscale and multimodal images. Currently, Viv supports `int8`, `int16`,
`int32`, `uint8`, `uint16`, `uint32`, `float32`, and `float64` arrays, but
contributions are welcome to support more `np.dtype`s!
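
The supported dtypes map directly onto NumPy dtype names, so an array can be checked before handing it to the viewer. A minimal illustrative sketch (the `SUPPORTED_DTYPES` set and `is_viewable` helper are hypothetical and simply restate the list above; they are not part of the vizarr API):

```python
import numpy as np

# Viv-renderable dtypes as listed above (illustrative, not a vizarr API).
SUPPORTED_DTYPES = {
    "int8", "int16", "int32",
    "uint8", "uint16", "uint32",
    "float32", "float64",
}

def is_viewable(arr: np.ndarray) -> bool:
    """Return True if the array's dtype is one Viv can render."""
    return arr.dtype.name in SUPPORTED_DTYPES

img = np.zeros((256, 256), dtype="uint16")   # typical microscopy dtype
print(is_viewable(img))                       # → True
print(is_viewable(img.astype("complex64")))   # → False
```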

### Getting started

Copy and paste a URL to a Zarr store as the `?source` query parameter in the
**[web app](https://hms-dbmi.github.io/vizarr/)**. For example, to view the
[example data](https://minio-dev.openmicroscopy.org/idr/v0.3/idr0062-blin-nuclearsegmentation/6001240.zarr)
from the IDR, you can use the following URL:

```
https://hms-dbmi.github.io/vizarr/?source=https://minio-dev.openmicroscopy.org/idr/v0.3/idr0062-blin-nuclearsegmentation/6001240.zarr
```
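
Since `?source` is an ordinary URL query parameter, shareable viewer links can also be generated programmatically. A small sketch using only the standard library (the store URL is the IDR example from above; the percent-encoding produced by `urlencode` is an assumption about what the web app accepts, since browsers decode query parameters on load):

```python
from urllib.parse import urlencode

# Build a shareable vizarr link for an arbitrary Zarr store URL.
base = "https://hms-dbmi.github.io/vizarr/"
source = (
    "https://minio-dev.openmicroscopy.org/idr/v0.3/"
    "idr0062-blin-nuclearsegmentation/6001240.zarr"
)
link = f"{base}?{urlencode({'source': source})}"
print(link)
```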

Otherwise, you can try out the Python API in a Jupyter Notebook by following [the
examples](./python/notebooks/getting_started.ipynb).

```sh
pip install vizarr
```

```python
import vizarr
import zarr

store = zarr.open("./path/to/ome.zarr")
viewer = vizarr.Viewer()
viewer.add_image(store)
viewer
```

### Limitations

`vizarr` was built to support the registration use case above, in which multiple pyramidal OME-Zarr images
are viewed within a Jupyter Notebook. Other Zarr arrays are supported but less thoroughly tested.
More information on viewing generic Zarr arrays can be found in the example notebooks.

### Citation

If you are using Vizarr in your research, please cite our paper:

> Trevor Manz, Ilan Gold, Nathan Heath Patterson, Chuck McCallum, Mark S Keller, Bruce W Herr II, Katy Börner, Jeffrey M Spraggins, Nils Gehlenborg,
17 changes: 0 additions & 17 deletions binder/environment.yml

This file was deleted.

40 changes: 0 additions & 40 deletions example/README.md

This file was deleted.

44 changes: 0 additions & 44 deletions example/VizarrDemo.imjoy.html

This file was deleted.
