
Conversation

@hongquanli
Collaborator

Summary

  • Add flatfield correction module using BaSiCPy for illumination correction
  • Two modes: Calculate from tiles or Load from .npy file
  • Per-channel flatfield and optional darkfield correction
  • Constant darkfield per channel (physically accurate for uniform dark current)

Features

  • Calculate: Sample ~50 tiles and compute flatfield using BaSiCPy
  • Load: Drag-and-drop .npy flatfield files
  • Auto-save: Calculated flatfield saved as {input}_flatfield.npy
  • Auto-load: Existing flatfield loaded when input file is dropped
  • View: Matplotlib visualization with colorbars (flatfield + darkfield)
  • Clear: Reset flatfield state

UI Changes

  • New "Flatfield Correction" section between Preview and Settings
  • Enable/disable checkbox (default: enabled)
  • Radio buttons for Calculate vs Load mode
  • Include darkfield correction option
  • Status label showing channel count
  • View and Clear buttons

Test plan

  • Drop OME-TIFF file and verify auto-load of existing flatfield
  • Calculate flatfield with darkfield enabled
  • View flatfield/darkfield in matplotlib (check constant darkfield values)
  • Run stitching with flatfield correction enabled
  • Clear flatfield and verify state reset

🤖 Generated with Claude Code

hongquanli and others added 9 commits December 29, 2025 21:47
Ignore Claude Code project memory file.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
- Add flatfield.py module with calculate/apply/save/load functions
- Add Flatfield Correction UI section between Preview and Settings
- Support two modes: Calculate from tiles (using BaSiCPy) or Load from file
- Per-channel flatfield with optional darkfield correction
- Auto-save calculated flatfield next to input file
- Add View button to visualize flatfield/darkfield in napari
- Integrate flatfield correction into TileFusion._read_tile()

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
- Remove enable/disable checkbox (presence of flatfield = enabled)
- Remove radio buttons for Calculate vs Load (both always available)
- Add auto-load of existing flatfield when input file is dropped
- Add Clear button to remove loaded flatfield
- Compact single-row layout for actions and status
- Remove redundant Save button (auto-save on calculate)

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
- Add Clear button next to View to reset flatfield state
- Auto-load existing flatfield when input file is dropped
- Initialize status label to "No flatfield" instead of empty

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
- Add Clear button next to View to reset flatfield state
- Auto-load existing flatfield when input file is dropped
- Initialize status label to "No flatfield" instead of empty
- Increase minimum window height to 850 for better content fit

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
When constant_darkfield=True (default), the darkfield is reduced to a
single constant value (median) per channel. This is physically appropriate
since dark current is typically uniform across the sensor and more robust
than spatially varying darkfield estimation.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
- Use matplotlib instead of napari for flatfield/darkfield visualization
- First row: flatfield per channel, second row: darkfield
- Colorbars with vmin=0 for proper scale
- Show constant darkfield value in title when uniform
- Save to temp PNG and open with system viewer (thread-safe)

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Contributor

Copilot AI left a comment

Pull request overview

This PR adds flatfield correction functionality to the tilefusion library using BaSiCPy for illumination correction. The implementation provides both automatic calculation from sampled tiles and manual loading of pre-computed flatfield files.

Key Changes:

  • New flatfield.py module with BaSiCPy integration for calculating and applying flatfield/darkfield corrections
  • Core library integration to apply corrections during tile reading operations
  • Comprehensive GUI for flatfield management including calculation, loading, saving, and visualization

Reviewed changes

Copilot reviewed 4 out of 5 changed files in this pull request and generated 15 comments.

File | Description
src/tilefusion/flatfield.py | New module implementing flatfield calculation, application, and save/load operations with BaSiCPy
src/tilefusion/core.py | Integration of flatfield correction into tile reading pipeline via optional parameters
src/tilefusion/__init__.py | Export flatfield functions and HAS_BASICPY flag for public API
gui/app.py | Complete GUI implementation with mode selection, worker threads, visualization, and auto-save/load features
.gitignore | Added CLAUDE.md to ignore list


Comment on lines 128 to 133
if darkfield is not None:
corrected = (tile - darkfield) / flatfield_safe
else:
corrected = tile / flatfield_safe

return corrected.astype(tile.dtype)
Copilot AI Jan 4, 2026

The apply_flatfield function performs division and subtraction operations that can produce floating-point values, but then casts the result back to the original integer dtype. This can lead to data loss and clipping issues. When working with integer dtypes (e.g., uint8, uint16), the intermediate floating-point values should be preserved or properly normalized to avoid losing precision and dynamic range after correction.

Collaborator Author

[Claude Code] Skipped - dtype casting is intentional. Flatfield values are normalized around 1.0, so corrected values stay in a similar range to the input. This is standard practice for microscopy flatfield correction.
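
For illustration only (not code from this PR): with a flatfield normalized around 1.0, the divide-then-cast pattern keeps corrected values in the same range as the input.

import numpy as np

tile = np.array([[1000, 2000], [3000, 4000]], dtype=np.uint16)    # raw counts
flatfield = np.array([[0.9, 1.0], [1.1, 1.0]], dtype=np.float32)  # gain map around 1.0

corrected = (tile / flatfield).astype(tile.dtype)
print(corrected)  # [[1111 2000] [2727 4000]] -- same order of magnitude as the input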

else:
corrected = region / ff_safe

return corrected.astype(region.dtype)
Copilot AI Jan 4, 2026

The apply_flatfield_region function performs division and subtraction operations that produce floating-point values, but casts the result back to the original dtype. This can cause data loss when working with integer dtypes (uint8, uint16). The corrected values should either remain as float or be properly normalized to preserve precision.

Suggested change
return corrected.astype(region.dtype)
# Preserve floating-point precision: only cast back to original dtype
# when it is already a floating type. For integer inputs, return float32.
if np.issubdtype(region.dtype, np.floating):
return corrected.astype(region.dtype)
else:
return corrected.astype(np.float32)

Collaborator Author

[Claude Code] Skipped - same as above, dtype casting is standard behavior for flatfield correction.

Comment on lines +1409 to +1411
with tempfile.NamedTemporaryFile(suffix=".png", delete=False) as f:
fig.savefig(f.name, dpi=150, bbox_inches="tight")
plt.close(fig)
Copilot AI Jan 4, 2026

The temporary file created for viewing the flatfield is never explicitly deleted. Using NamedTemporaryFile with delete=False leaves the file on disk after the viewer closes. This can accumulate temporary files over time. Consider using a cleanup mechanism or setting delete=True (though this may cause issues on Windows with some viewers).

Collaborator Author

[Claude Code] Skipped - low value change. Temp PNG files are small and are cleaned up by the OS. The viewer needs the file to persist while it is open.
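
For reference, a minimal sketch of the cleanup mechanism the reviewer describes (delete the temp PNG at application exit, so the external viewer can still read it while open). This is not part of the PR; save_temp_png is a hypothetical helper.

import atexit
import os
import tempfile

def save_temp_png(fig):
    """Save a matplotlib figure to a temp PNG and schedule deletion at exit."""
    with tempfile.NamedTemporaryFile(suffix=".png", delete=False) as f:
        fig.savefig(f.name, dpi=150, bbox_inches="tight")
    atexit.register(lambda p=f.name: os.path.exists(p) and os.remove(p))
    return f.name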

Comment on lines +78 to +80
for ch in range(n_channels):
# Stack channel data from all tiles: shape (n_tiles, Y, X)
channel_stack = np.stack([tile[ch] for tile in tiles], axis=0)
Copilot AI Jan 4, 2026

The calculate_flatfield function doesn't validate that all tiles have compatible data types. If tiles have different dtypes (e.g., mix of uint8 and uint16), numpy stacking could cause unexpected type coercion or errors. Consider adding validation to ensure all tiles have the same dtype.

Collaborator Author

[Claude Code] Skipped - NumPy handles dtype coercion automatically when stacking. BaSiCPy operates on float internally.
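
For completeness, a sketch of the check the reviewer suggests (hypothetical helper, not added in this PR):

from typing import List

import numpy as np

def check_tile_dtypes(tiles: List[np.ndarray]) -> None:
    """Ensure all tiles share one dtype before stacking for BaSiCPy."""
    ref_dtype = tiles[0].dtype
    for i, tile in enumerate(tiles):
        if tile.dtype != ref_dtype:
            raise ValueError(f"Tile {i} has dtype {tile.dtype}, expected {ref_dtype}")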

Comment on lines 1 to 226
"""
Flatfield correction module using BaSiCPy.

Provides functions to calculate and apply flatfield (and optionally darkfield)
correction for microscopy images.
"""

from pathlib import Path
from typing import List, Optional, Tuple

import numpy as np

try:
from basicpy import BaSiC

HAS_BASICPY = True
except ImportError:
HAS_BASICPY = False


def calculate_flatfield(
tiles: List[np.ndarray],
use_darkfield: bool = False,
constant_darkfield: bool = True,
) -> Tuple[np.ndarray, Optional[np.ndarray]]:
"""
Calculate flatfield (and optionally darkfield) using BaSiCPy.

Parameters
----------
tiles : list of ndarray
List of tile images, each with shape (C, Y, X).
use_darkfield : bool
Whether to also compute darkfield correction.
constant_darkfield : bool
If True, darkfield is reduced to a single constant value (median) per
channel. This is physically appropriate since dark current is typically
uniform across the sensor. Default is True.

Returns
-------
flatfield : ndarray
Flatfield correction array with shape (C, Y, X), float32.
darkfield : ndarray or None
Darkfield correction array with shape (C, Y, X), or None if not computed.
If constant_darkfield=True, each channel slice will be a constant value.

Raises
------
ImportError
If basicpy is not installed.
ValueError
If tiles list is empty or tiles have inconsistent shapes.
"""
if not HAS_BASICPY:
raise ImportError(
"basicpy is required for flatfield calculation. " "Install with: pip install basicpy"
)

if not tiles:
raise ValueError("tiles list is empty")

# Get shape from first tile
n_channels = tiles[0].shape[0]
tile_shape = tiles[0].shape[1:] # (Y, X)

# Validate all tiles have same shape
for i, tile in enumerate(tiles):
if tile.shape[0] != n_channels:
raise ValueError(f"Tile {i} has {tile.shape[0]} channels, expected {n_channels}")
if tile.shape[1:] != tile_shape:
raise ValueError(f"Tile {i} has shape {tile.shape[1:]}, expected {tile_shape}")

# Calculate flatfield per channel
flatfield = np.zeros((n_channels,) + tile_shape, dtype=np.float32)
darkfield = np.zeros((n_channels,) + tile_shape, dtype=np.float32) if use_darkfield else None

for ch in range(n_channels):
# Stack channel data from all tiles: shape (n_tiles, Y, X)
channel_stack = np.stack([tile[ch] for tile in tiles], axis=0)

# Create BaSiC instance and fit
basic = BaSiC(get_darkfield=use_darkfield, smoothness_flatfield=1.0)
basic.fit(channel_stack)

flatfield[ch] = basic.flatfield.astype(np.float32)

if use_darkfield:
if constant_darkfield:
# Use median value for constant darkfield (more robust than mean)
df_value = np.median(basic.darkfield)
darkfield[ch] = np.full(tile_shape, df_value, dtype=np.float32)
else:
darkfield[ch] = basic.darkfield.astype(np.float32)

return flatfield, darkfield


def apply_flatfield(
tile: np.ndarray,
flatfield: np.ndarray,
darkfield: Optional[np.ndarray] = None,
) -> np.ndarray:
"""
Apply flatfield correction to a tile.

Formula:
If darkfield is provided: corrected = (raw - darkfield) / flatfield
Otherwise: corrected = raw / flatfield

Parameters
----------
tile : ndarray
Input tile with shape (C, Y, X).
flatfield : ndarray
Flatfield correction array with shape (C, Y, X).
darkfield : ndarray, optional
Darkfield correction array with shape (C, Y, X).

Returns
-------
corrected : ndarray
Corrected tile with shape (C, Y, X), same dtype as input.
"""
# Avoid division by zero
flatfield_safe = np.where(flatfield > 1e-6, flatfield, 1.0)

if darkfield is not None:
corrected = (tile - darkfield) / flatfield_safe
else:
corrected = tile / flatfield_safe

return corrected.astype(tile.dtype)


def apply_flatfield_region(
region: np.ndarray,
flatfield: np.ndarray,
darkfield: Optional[np.ndarray],
y_slice: slice,
x_slice: slice,
) -> np.ndarray:
"""
Apply flatfield correction to a tile region.

Parameters
----------
region : ndarray
Input region with shape (C, h, w) or (h, w).
flatfield : ndarray
Full flatfield correction array with shape (C, Y, X).
darkfield : ndarray, optional
Full darkfield correction array with shape (C, Y, X).
y_slice, x_slice : slice
Slices defining the region within the full tile.

Returns
-------
corrected : ndarray
Corrected region with same shape as input.
"""
# Extract corresponding flatfield/darkfield regions
if region.ndim == 2:
ff_region = flatfield[0, y_slice, x_slice]
df_region = darkfield[0, y_slice, x_slice] if darkfield is not None else None
else:
ff_region = flatfield[:, y_slice, x_slice]
df_region = darkfield[:, y_slice, x_slice] if darkfield is not None else None

# Avoid division by zero
ff_safe = np.where(ff_region > 1e-6, ff_region, 1.0)

if df_region is not None:
corrected = (region - df_region) / ff_safe
else:
corrected = region / ff_safe

return corrected.astype(region.dtype)


def save_flatfield(
path: Path,
flatfield: np.ndarray,
darkfield: Optional[np.ndarray] = None,
) -> None:
"""
Save flatfield (and optionally darkfield) to a .npy file.

Parameters
----------
path : Path
Output path (should end with .npy).
flatfield : ndarray
Flatfield array with shape (C, Y, X).
darkfield : ndarray, optional
Darkfield array with shape (C, Y, X).
"""
data = {
"flatfield": flatfield.astype(np.float32),
"darkfield": darkfield.astype(np.float32) if darkfield is not None else None,
"channels": flatfield.shape[0],
"shape": flatfield.shape[1:],
}
np.save(path, data, allow_pickle=True)


def load_flatfield(path: Path) -> Tuple[np.ndarray, Optional[np.ndarray]]:
"""
Load flatfield (and optionally darkfield) from a .npy file.

Parameters
----------
path : Path
Path to .npy file.

Returns
-------
flatfield : ndarray
Flatfield array with shape (C, Y, X).
darkfield : ndarray or None
Darkfield array with shape (C, Y, X), or None if not present.
"""
data = np.load(path, allow_pickle=True).item()
flatfield = data["flatfield"]
darkfield = data.get("darkfield", None)
return flatfield, darkfield
Copilot AI Jan 4, 2026

The new flatfield correction functionality lacks test coverage. Given that the repository has comprehensive test suites for other modules (test_fusion.py, test_io.py, test_registration.py, etc.), the flatfield module should also have tests covering key scenarios like calculate_flatfield, apply_flatfield, save/load operations, and edge cases such as division by zero handling and invalid inputs.

Collaborator Author

[Claude Code] Skipped - test coverage is a feature request, not a bug. Out of scope for code review fixes.
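
For reference, a sketch of the kind of tests the reviewer asks for (not included in this PR). Function names follow the flatfield module quoted above; the import path assumes the package layout listed in the file summary.

import numpy as np

from tilefusion.flatfield import apply_flatfield, load_flatfield, save_flatfield

def test_apply_flatfield_identity():
    tile = np.full((1, 4, 4), 100, dtype=np.uint16)
    flat = np.ones((1, 4, 4), dtype=np.float32)
    corrected = apply_flatfield(tile, flat)
    assert corrected.dtype == tile.dtype
    np.testing.assert_array_equal(corrected, tile)

def test_save_load_roundtrip(tmp_path):
    flat = np.ones((2, 8, 8), dtype=np.float32)
    dark = np.full((2, 8, 8), 5.0, dtype=np.float32)
    path = tmp_path / "ff.npy"
    save_flatfield(path, flat, dark)
    flat2, dark2 = load_flatfield(path)
    np.testing.assert_allclose(flat2, flat)
    np.testing.assert_allclose(dark2, dark)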

Comment on lines 99 to 133
def apply_flatfield(
tile: np.ndarray,
flatfield: np.ndarray,
darkfield: Optional[np.ndarray] = None,
) -> np.ndarray:
"""
Apply flatfield correction to a tile.

Formula:
If darkfield is provided: corrected = (raw - darkfield) / flatfield
Otherwise: corrected = raw / flatfield

Parameters
----------
tile : ndarray
Input tile with shape (C, Y, X).
flatfield : ndarray
Flatfield correction array with shape (C, Y, X).
darkfield : ndarray, optional
Darkfield correction array with shape (C, Y, X).

Returns
-------
corrected : ndarray
Corrected tile with shape (C, Y, X), same dtype as input.
"""
# Avoid division by zero
flatfield_safe = np.where(flatfield > 1e-6, flatfield, 1.0)

if darkfield is not None:
corrected = (tile - darkfield) / flatfield_safe
else:
corrected = tile / flatfield_safe

return corrected.astype(tile.dtype)
Copilot AI Jan 4, 2026

The apply_flatfield and apply_flatfield_region functions don't validate that the flatfield and tile/region shapes are compatible. If the flatfield has a different number of channels or spatial dimensions than the tile, the operations will fail with unclear numpy broadcasting errors. Add input validation to provide clear error messages about shape mismatches.

Collaborator Author

[Claude Code] Fixed in commit ed4e8d4 - added shape validation with clear error messages.

darkfield : ndarray or None
Darkfield array with shape (C, Y, X), or None if not present.
"""
data = np.load(path, allow_pickle=True).item()
Copilot AI Jan 4, 2026

The load_flatfield function calls .item() on the loaded data, which will fail if the .npy file doesn't contain a dictionary object. This can happen if someone provides a regular numpy array file. Add error handling or validation to provide a clear error message when the file format is invalid.

Suggested change
data = np.load(path, allow_pickle=True).item()
loaded = np.load(path, allow_pickle=True)
try:
data = loaded.item()
except (AttributeError, ValueError) as exc:
raise ValueError(
f"Invalid flatfield file format at '{path}'. "
"Expected a NumPy .npy file containing a dictionary as saved by "
"`save_flatfield` (with keys like 'flatfield' and 'darkfield')."
) from exc
if not isinstance(data, dict) or "flatfield" not in data:
raise ValueError(
f"Invalid flatfield file format at '{path}'. "
"Expected a dictionary with at least a 'flatfield' entry."
)

Collaborator Author

[Claude Code] Fixed in commit ed4e8d4 - added try/except and validation for invalid file formats.

Comment on lines 1364 to 1420
matplotlib.use("Agg") # Non-interactive backend
import matplotlib.pyplot as plt
import numpy as np
import tempfile
import subprocess

n_channels = self.flatfield.shape[0]
has_darkfield = self.darkfield is not None
n_rows = 2 if has_darkfield else 1

fig, axes = plt.subplots(n_rows, n_channels, figsize=(4 * n_channels, 4 * n_rows))

# Handle single channel case (axes not 2D)
if n_channels == 1 and n_rows == 1:
axes = [[axes]]
elif n_channels == 1:
axes = [[ax] for ax in axes]
elif n_rows == 1:
axes = [axes]

# First row: flatfield
for ch in range(n_channels):
ax = axes[0][ch]
im = ax.imshow(self.flatfield[ch], cmap="viridis", vmin=0)
ax.set_title(f"Flatfield Ch{ch}")
ax.axis("off")
plt.colorbar(im, ax=ax, fraction=0.046, pad=0.04)

# Second row: darkfield (if available)
if has_darkfield:
for ch in range(n_channels):
ax = axes[1][ch]
im = ax.imshow(self.darkfield[ch], cmap="magma", vmin=0)
# Show constant value in title if darkfield is uniform
df_val = self.darkfield[ch].ravel()[0]
if np.allclose(self.darkfield[ch], df_val):
ax.set_title(f"Darkfield Ch{ch} (={df_val:.1f})")
else:
ax.set_title(f"Darkfield Ch{ch}")
ax.axis("off")
plt.colorbar(im, ax=ax, fraction=0.046, pad=0.04)

plt.tight_layout()

# Save to temp file and open with system viewer
with tempfile.NamedTemporaryFile(suffix=".png", delete=False) as f:
fig.savefig(f.name, dpi=150, bbox_inches="tight")
plt.close(fig)
# Open with default image viewer
if sys.platform == "darwin":
subprocess.Popen(["open", f.name])
elif sys.platform == "win32":
subprocess.Popen(["start", f.name], shell=True)
else:
subprocess.Popen(["xdg-open", f.name])

self.log("Opened flatfield viewer")
Copilot AI Jan 4, 2026

The matplotlib backend is set to 'Agg' (non-interactive) inside the view_flatfield method, which changes the global matplotlib state. If other parts of the application or future code use matplotlib with a different backend, this could cause conflicts. Consider saving and restoring the original backend, or using the backend context manager if available.

Suggested change
matplotlib.use("Agg") # Non-interactive backend
import matplotlib.pyplot as plt
import numpy as np
import tempfile
import subprocess
n_channels = self.flatfield.shape[0]
has_darkfield = self.darkfield is not None
n_rows = 2 if has_darkfield else 1
fig, axes = plt.subplots(n_rows, n_channels, figsize=(4 * n_channels, 4 * n_rows))
# Handle single channel case (axes not 2D)
if n_channels == 1 and n_rows == 1:
axes = [[axes]]
elif n_channels == 1:
axes = [[ax] for ax in axes]
elif n_rows == 1:
axes = [axes]
# First row: flatfield
for ch in range(n_channels):
ax = axes[0][ch]
im = ax.imshow(self.flatfield[ch], cmap="viridis", vmin=0)
ax.set_title(f"Flatfield Ch{ch}")
ax.axis("off")
plt.colorbar(im, ax=ax, fraction=0.046, pad=0.04)
# Second row: darkfield (if available)
if has_darkfield:
for ch in range(n_channels):
ax = axes[1][ch]
im = ax.imshow(self.darkfield[ch], cmap="magma", vmin=0)
# Show constant value in title if darkfield is uniform
df_val = self.darkfield[ch].ravel()[0]
if np.allclose(self.darkfield[ch], df_val):
ax.set_title(f"Darkfield Ch{ch} (={df_val:.1f})")
else:
ax.set_title(f"Darkfield Ch{ch}")
ax.axis("off")
plt.colorbar(im, ax=ax, fraction=0.046, pad=0.04)
plt.tight_layout()
# Save to temp file and open with system viewer
with tempfile.NamedTemporaryFile(suffix=".png", delete=False) as f:
fig.savefig(f.name, dpi=150, bbox_inches="tight")
plt.close(fig)
# Open with default image viewer
if sys.platform == "darwin":
subprocess.Popen(["open", f.name])
elif sys.platform == "win32":
subprocess.Popen(["start", f.name], shell=True)
else:
subprocess.Popen(["xdg-open", f.name])
self.log("Opened flatfield viewer")
# Save current backend and temporarily switch to non-interactive backend
orig_backend = matplotlib.get_backend()
try:
matplotlib.use("Agg") # Non-interactive backend
import matplotlib.pyplot as plt
import numpy as np
import tempfile
import subprocess
n_channels = self.flatfield.shape[0]
has_darkfield = self.darkfield is not None
n_rows = 2 if has_darkfield else 1
fig, axes = plt.subplots(n_rows, n_channels, figsize=(4 * n_channels, 4 * n_rows))
# Handle single channel case (axes not 2D)
if n_channels == 1 and n_rows == 1:
axes = [[axes]]
elif n_channels == 1:
axes = [[ax] for ax in axes]
elif n_rows == 1:
axes = [axes]
# First row: flatfield
for ch in range(n_channels):
ax = axes[0][ch]
im = ax.imshow(self.flatfield[ch], cmap="viridis", vmin=0)
ax.set_title(f"Flatfield Ch{ch}")
ax.axis("off")
plt.colorbar(im, ax=ax, fraction=0.046, pad=0.04)
# Second row: darkfield (if available)
if has_darkfield:
for ch in range(n_channels):
ax = axes[1][ch]
im = ax.imshow(self.darkfield[ch], cmap="magma", vmin=0)
# Show constant value in title if darkfield is uniform
df_val = self.darkfield[ch].ravel()[0]
if np.allclose(self.darkfield[ch], df_val):
ax.set_title(f"Darkfield Ch{ch} (={df_val:.1f})")
else:
ax.set_title(f"Darkfield Ch{ch}")
ax.axis("off")
plt.colorbar(im, ax=ax, fraction=0.046, pad=0.04)
plt.tight_layout()
# Save to temp file and open with system viewer
with tempfile.NamedTemporaryFile(suffix=".png", delete=False) as f:
fig.savefig(f.name, dpi=150, bbox_inches="tight")
plt.close(fig)
# Open with default image viewer
if sys.platform == "darwin":
subprocess.Popen(["open", f.name])
elif sys.platform == "win32":
subprocess.Popen(["start", f.name], shell=True)
else:
subprocess.Popen(["xdg-open", f.name])
self.log("Opened flatfield viewer")
finally:
# Restore original matplotlib backend
try:
matplotlib.use(orig_backend)
except Exception:
# If restoring fails, ignore to avoid masking the original error
pass

Collaborator Author

[Claude Code] Skipped - low priority. GUI doesn't use matplotlib interactively elsewhere. Backend set for non-interactive PNG generation only.

Comment on lines +1261 to +1269
self.flatfield_worker = FlatfieldWorker(
self.drop_area.file_path,
n_samples=50,
use_darkfield=self.darkfield_checkbox.isChecked(),
)
self.flatfield_worker.progress.connect(self.log)
self.flatfield_worker.finished.connect(self.on_flatfield_calculated)
self.flatfield_worker.error.connect(self.on_flatfield_error)
self.flatfield_worker.start()
Copilot AI Jan 4, 2026

The FlatfieldWorker instance is stored as self.flatfield_worker but is never cleaned up after completion. If the user calculates the flatfield multiple times, the old worker threads remain referenced. Consider clearing the reference in the finished/error handlers or using a more robust thread management pattern.

Collaborator Author

[Claude Code] Skipped - the worker reference is overwritten on the next calculation, allowing the previous worker to be garbage collected. Not a memory leak in practice.
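
For reference, a sketch of the explicit cleanup the reviewer suggests (not adopted). It assumes FlatfieldWorker is a QThread subclass, as implied by the start()/signal usage above; the slot name is hypothetical.

def on_flatfield_worker_done(self):
    worker = self.flatfield_worker
    if worker is not None:
        worker.deleteLater()          # let Qt dispose of the finished thread object
        self.flatfield_worker = None  # drop the Python reference immediately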

"darkfield": darkfield.astype(np.float32) if darkfield is not None else None,
"channels": flatfield.shape[0],
"shape": flatfield.shape[1:],
}
Copilot AI Jan 4, 2026

The save_flatfield function doesn't validate the input path or check if the parent directory exists. If the path points to a non-existent directory, the save operation will fail with an unclear error. Consider adding path validation or creating parent directories if needed.

Suggested change
}
}
# Ensure the parent directory exists before saving
parent = path.parent
if parent and not parent.exists():
parent.mkdir(parents=True, exist_ok=True)

Collaborator Author

[Claude Code] Skipped - save dialog ensures valid paths. Parent directories typically exist when user selects via file dialog.

- Fix #9: Preserve flatfield data when checkbox is toggled off (just hide UI)
- Fix #10: Combine error message string into single line
- Fix #11: Add shape validation in apply_flatfield and apply_flatfield_region
- Fix #12: Add error handling in load_flatfield for invalid file formats

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Contributor

Copilot AI left a comment

Pull request overview

Copilot reviewed 4 out of 5 changed files in this pull request and generated 16 comments.



Comment on lines 140 to 141
# Avoid division by zero
flatfield_safe = np.where(flatfield > 1e-6, flatfield, 1.0)
Copilot AI Jan 4, 2026

The threshold value 1e-6 for avoiding division by zero may be too small for some data types or imaging conditions. Consider making this configurable or using a more robust approach like np.finfo(flatfield.dtype).eps or a small percentile of the flatfield values to avoid introducing artifacts in regions with legitimately low flatfield values.

Suggested change
# Avoid division by zero
flatfield_safe = np.where(flatfield > 1e-6, flatfield, 1.0)
# Avoid division by zero using a dtype-aware epsilon
if np.issubdtype(flatfield.dtype, np.floating):
safe_eps = np.finfo(flatfield.dtype).eps
else:
# Fall back to a reasonable default for non-floating dtypes
safe_eps = np.finfo(np.float32).eps
flatfield_safe = np.where(flatfield > safe_eps, flatfield, 1.0)

Collaborator Author

[Claude Code] Skipped - 1e-6 is reasonable for flatfield. Values near eps would indicate sensor defects, not legitimate low flatfield.

else:
corrected = tile / flatfield_safe

return corrected.astype(tile.dtype)
Copilot AI Jan 4, 2026

The dtype conversion in apply_flatfield may cause data loss or unexpected behavior. When applying flatfield correction, the corrected values can exceed the original dtype range (especially after division) or become negative (after darkfield subtraction). Consider either clipping the values to the valid range for the dtype or documenting this potential issue in the function docstring.

Collaborator Author

[Claude Code] Skipped - same as previous review. Dtype casting is standard practice for flatfield correction.

Comment on lines +1402 to +1403
with tempfile.NamedTemporaryFile(suffix=".png", delete=False) as f:
fig.savefig(f.name, dpi=150, bbox_inches="tight")
Copilot AI Jan 4, 2026

The temporary file created here is never explicitly deleted. While the OS may clean it up eventually, this could lead to accumulation of temporary files if the user views the flatfield multiple times. Consider using a context manager or explicitly tracking and deleting the temp file.

Collaborator Author

[Claude Code] Skipped - repeat comment, already addressed. Low value change.

self.progress.emit(f"Sampling {n_samples} tiles from {n_tiles} total...")

# Random sample of tile indices
rng = np.random.default_rng(42)
Copilot AI Jan 4, 2026

The random seed is hardcoded to 42. While this provides reproducibility, it means flatfield calculations will always use the same tiles. Consider making the seed configurable or documenting why a fixed seed is used, as users may want different sampling for verification purposes.

Suggested change
rng = np.random.default_rng(42)
seed_env = os.getenv("STITCHER_FLATFIELD_RNG_SEED")
try:
seed = int(seed_env) if seed_env is not None else 42
except (TypeError, ValueError):
seed = 42
rng = np.random.default_rng(seed)

Collaborator Author

[Claude Code] Skipped - fixed seed ensures reproducibility for debugging. Different sampling rarely affects BaSiC results.

gui/app.py Outdated
tiles.append(tile)

self.progress.emit("Calculating flatfield with BaSiCPy...")
flatfield, darkfield = calculate_flatfield(tiles, use_darkfield=self.use_darkfield)
Copilot AI Jan 4, 2026

The constant_darkfield parameter defaults to True but is not exposed in the GUI. Users calculating flatfield through the GUI will always get constant darkfield without the option to change it. Consider either exposing this option in the GUI or documenting why the constant darkfield is always used.

Collaborator Author

[Claude Code] Skipped - repeat comment, already marked intentional. Constant darkfield is physically correct default.

"channels": flatfield.shape[0],
"shape": flatfield.shape[1:],
}
np.save(path, data, allow_pickle=True)
Copilot AI Jan 4, 2026

Using allow_pickle=True when saving enables pickling, which creates a security risk when the file is loaded. Since the data being saved is just numpy arrays and simple metadata (channels, shape), consider using np.savez instead, which doesn't require pickling and is more secure.

Collaborator Author

[Claude Code] Skipped - repeat of security concern. Saving is safe; a format change would break backward compatibility.
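
For reference, a sketch of the pickle-free format the reviewer proposes (not adopted here, since it would change the on-disk format). The function names are hypothetical; np.savez stores plain arrays without pickling.

import numpy as np

def save_flatfield_npz(path, flatfield, darkfield=None):
    arrays = {"flatfield": flatfield.astype(np.float32)}
    if darkfield is not None:
        arrays["darkfield"] = darkfield.astype(np.float32)
    np.savez(path, **arrays)

def load_flatfield_npz(path):
    with np.load(path) as data:  # allow_pickle defaults to False
        flatfield = data["flatfield"]
        darkfield = data["darkfield"] if "darkfield" in data.files else None
    return flatfield, darkfield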

self.calc_flatfield_button.setEnabled(False)
self.flatfield_status.setText("Calculating flatfield...")
self.flatfield_status.setStyleSheet("color: #ff9500; font-size: 11px;")

Copilot AI Jan 4, 2026

The n_samples parameter is hardcoded to 50 in the GUI. For datasets with very few tiles (e.g., < 50), this is fine since it's handled with min(). However, for very large datasets, 50 tiles may not be representative enough for accurate flatfield estimation. Consider exposing this parameter in the GUI or documenting the rationale for choosing 50 samples.

Suggested change
# NOTE: n_samples is intentionally fixed at 50 as a compromise between
# runtime and flatfield quality. The FlatfieldWorker internally uses
# min(total_tiles, n_samples), so small datasets are handled safely.

Collaborator Author

[Claude Code] Skipped - 50 samples is empirically sufficient for BaSiC. Comment would add clutter for obvious code.

Comment on lines +257 to +262
except (AttributeError, ValueError) as exc:
raise ValueError(
f"Invalid flatfield file format at '{path}'. "
"Expected a NumPy .npy file containing a dictionary as saved by "
"`save_flatfield` (with keys like 'flatfield' and 'darkfield')."
) from exc
Copilot AI Jan 4, 2026

The error handling catches AttributeError and ValueError, but np.load() can also raise OSError for file access issues. Consider also catching OSError to provide a more informative error message for file-not-found or permission issues.

Collaborator Author

[Claude Code] Fixed in commit 1728628 - added OSError handling with informative error message.

ValueError
If the file format is invalid (not a dictionary with 'flatfield' key).
"""
loaded = np.load(path, allow_pickle=True)
Copilot AI Jan 4, 2026

Using allow_pickle=True when loading numpy files poses a security risk as it can execute arbitrary code during deserialization. Since the flatfield data is just numeric arrays and metadata, consider using a safer format like .npz without pickling, or validate the file structure more carefully before loading with pickle enabled.

Collaborator Author

[Claude Code] Skipped - repeat of security concern. Already addressed as false positive for local file loading.

self.flatfield_drop_area.clear()
self.view_flatfield_button.setEnabled(False)
self.clear_flatfield_button.setEnabled(False)
self.save_flatfield_button.setEnabled(False)
Copilot AI Jan 4, 2026

When a new file is dropped, the flatfield is cleared but the checkbox remains checked. If flatfield correction is enabled but no flatfield is loaded/calculated, the stitching may proceed without correction or fail. Consider either disabling the checkbox when flatfield is cleared, or checking if flatfield data exists before using it in the workers.

Suggested change
self.save_flatfield_button.setEnabled(False)
self.save_flatfield_button.setEnabled(False)
# Ensure flatfield correction is not marked as enabled when no flatfield is loaded
self.flatfield_checkbox.setChecked(False)

Collaborator Author

[Claude Code] Fixed in commit 1728628 - checkbox now unchecked when flatfield cleared, re-checked when auto-loaded.

- Fix #8: Update docstring to clarify dtype casting behavior (may clip)
- Fix #14: Add OSError handling in load_flatfield for file access errors
- Fix #16: Sync flatfield checkbox state with loaded/cleared status

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Contributor

Copilot AI left a comment

Pull request overview

Copilot reviewed 4 out of 5 changed files in this pull request and generated 18 comments.



Comment on lines 143 to 149
# Avoid division by zero
flatfield_safe = np.where(flatfield > 1e-6, flatfield, 1.0)

if darkfield is not None:
corrected = (tile - darkfield) / flatfield_safe
else:
corrected = tile / flatfield_safe
Copilot AI Jan 4, 2026

The darkfield subtraction should be performed on float data before division. When tile is an unsigned integer type (e.g., uint16), the subtraction tile - darkfield can result in negative values that get clipped to 0 when cast back to the unsigned dtype at line 151. This could lead to incorrect correction results. Consider converting to float before applying the correction.

Suggested change
# Avoid division by zero
flatfield_safe = np.where(flatfield > 1e-6, flatfield, 1.0)
if darkfield is not None:
corrected = (tile - darkfield) / flatfield_safe
else:
corrected = tile / flatfield_safe
# Perform computations in floating point to avoid underflow/clipping,
# especially for unsigned integer input types.
tile_float = tile.astype(np.float32, copy=False)
flatfield_float = flatfield.astype(np.float32, copy=False)
darkfield_float = None
if darkfield is not None:
darkfield_float = darkfield.astype(np.float32, copy=False)
# Avoid division by zero
flatfield_safe = np.where(flatfield_float > 1e-6, flatfield_float, 1.0)
if darkfield_float is not None:
corrected = (tile_float - darkfield_float) / flatfield_safe
else:
corrected = tile_float / flatfield_safe

Collaborator Author

[Claude Code] Fixed in commit 4ed5f0b - now converts to float32 before arithmetic to avoid uint underflow.

gui/app.py Outdated
Comment on lines 1224 to 1225
# Uncheck flatfield correction when no flatfield is loaded
self.flatfield_checkbox.setChecked(False)
Copilot AI Jan 4, 2026

Unchecking flatfield correction when a new file is loaded (line 1225) may be unexpected behavior. If a user had flatfield correction enabled and then switches to view a different file, they might expect the correction to remain enabled if they load a new flatfield. Consider preserving the checkbox state and only disabling it if explicitly cleared by the user.

Collaborator Author

[Claude Code] Skipped - current behavior is correct: uncheck if no flatfield, re-check on successful auto-load.

save_ff(auto_save_path, self.flatfield, self.darkfield)
self.log(f"Auto-saved flatfield to {auto_save_path}")
except Exception as e:
self.log(f"Warning: Could not auto-save flatfield: {e}")
Copilot AI Jan 4, 2026

The warning message format should match other log messages in the application. Consider using "Warning:" with a capital W to be consistent with error handling patterns elsewhere in the GUI code.

Collaborator Author

[Claude Code] Skipped - minor style nitpick, no existing "Warning:" pattern to match.

# Flatfield correction (optional)
self._flatfield = flatfield # Shape (C, Y, X) or None
self._darkfield = darkfield # Shape (C, Y, X) or None

Copilot AI Jan 4, 2026

When flatfield correction is enabled but the flatfield shape doesn't match the tile shape (e.g., if a flatfield was calculated from different-sized tiles), the error will occur deep in the read path. Consider validating the flatfield shape against tile dimensions when the TileFusion instance is created, and providing a clear error message early rather than failing during tile reading.

Suggested change
# Validate flatfield/darkfield spatial shape against tile dimensions, if available.
expected_tile_shape: Optional[Tuple[int, int]] = None
if hasattr(self, "tile_shape"):
expected_tile_shape = getattr(self, "tile_shape")
elif hasattr(self, "tile_size"):
expected_tile_shape = getattr(self, "tile_size")
elif hasattr(self, "tile_y") and hasattr(self, "tile_x"):
expected_tile_shape = (getattr(self, "tile_y"), getattr(self, "tile_x"))
# Only perform validation if we were able to infer an expected tile shape.
if expected_tile_shape is not None:
if self._flatfield is not None:
ff_shape = tuple(self._flatfield.shape)
if len(ff_shape) != 3 or ff_shape[1:] != tuple(expected_tile_shape):
raise ValueError(
f"Flatfield spatial shape {ff_shape[1:]} does not match expected "
f"tile shape {tuple(expected_tile_shape)}. Ensure the flatfield "
"was computed from tiles of the same size."
)
if self._darkfield is not None:
df_shape = tuple(self._darkfield.shape)
if len(df_shape) != 3 or df_shape[1:] != tuple(expected_tile_shape):
raise ValueError(
f"Darkfield spatial shape {df_shape[1:]} does not match expected "
f"tile shape {tuple(expected_tile_shape)}. Ensure the darkfield "
"was computed from tiles of the same size."
)

Collaborator Author

[Claude Code] Skipped - validation at use-time (already added) provides clear errors. Early validation requires reading tile shape first.

Comment on lines 146 to 151
if darkfield is not None:
corrected = (tile - darkfield) / flatfield_safe
else:
corrected = tile / flatfield_safe

return corrected.astype(tile.dtype)
Copilot AI Jan 4, 2026

Flatfield correction is applied on every tile read, which may impact performance for large datasets. The division operation is computed in the original dtype space (line 147-149, 151), which means the computation is repeated for each read. Consider documenting the performance implications or adding an option to pre-process and cache corrected tiles if users need to read the same tile multiple times.

Collaborator Author

[Claude Code] Skipped - flatfield correction is a simple division; performance is acceptable. Documentation would add clutter.
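
For reference, a sketch of the optional caching the reviewer mentions (not part of the PR). read_tile stands in for any callable returning a raw (C, Y, X) tile; it is not an actual tilefusion API.

from functools import lru_cache

from tilefusion.flatfield import apply_flatfield

def make_cached_reader(read_tile, flatfield, darkfield=None, maxsize=64):
    """Wrap a tile reader so repeated reads of the same index reuse the corrected array."""
    @lru_cache(maxsize=maxsize)
    def read_corrected(index):
        return apply_flatfield(read_tile(index), flatfield, darkfield)
    return read_corrected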

Comment on lines +230 to +232
self._flatfield = flatfield # Shape (C, Y, X) or None
self._darkfield = darkfield # Shape (C, Y, X) or None

Copilot AI Jan 4, 2026

Thread-safety concern: The flatfield and darkfield arrays are shared across threads without synchronization. While NumPy arrays are generally thread-safe for read-only access, the current implementation doesn't prevent a race condition if these arrays could be modified (e.g., via clear_flatfield in the GUI) while worker threads are reading tiles. Consider documenting that these should not be modified during processing, or add appropriate locking.

Suggested change
self._flatfield = flatfield # Shape (C, Y, X) or None
self._darkfield = darkfield # Shape (C, Y, X) or None
if flatfield is not None:
ff = np.array(flatfield, copy=True)
ff.setflags(write=False)
self._flatfield = ff # Shape (C, Y, X)
else:
self._flatfield = None
if darkfield is not None:
df = np.array(darkfield, copy=True)
df.setflags(write=False)
self._darkfield = df # Shape (C, Y, X)
else:
self._darkfield = None

Collaborator Author

[Claude Code] Skipped - GUI clears flatfield before workers start; no race in practice.

gui/app.py Outdated
Comment on lines 1224 to 1231
# Uncheck flatfield correction when no flatfield is loaded
self.flatfield_checkbox.setChecked(False)

# Auto-load existing flatfield if present
flatfield_path = path.parent / f"{path.stem}_flatfield.npy"
if flatfield_path.exists():
self.log(f"Found existing flatfield: {flatfield_path.name}")
self.on_flatfield_dropped(str(flatfield_path))
Copilot AI Jan 4, 2026

The auto-load feature at line 1231 will load an existing flatfield but the checkbox was just unchecked at line 1225, creating potentially confusing behavior. If an existing flatfield is found and loaded, the correction should probably be enabled by default. The call to on_flatfield_dropped at line 1231 will enable it (line 1346), but this happens after it was explicitly disabled, which is confusing logic flow.

Suggested change
# Uncheck flatfield correction when no flatfield is loaded
self.flatfield_checkbox.setChecked(False)
# Auto-load existing flatfield if present
flatfield_path = path.parent / f"{path.stem}_flatfield.npy"
if flatfield_path.exists():
self.log(f"Found existing flatfield: {flatfield_path.name}")
self.on_flatfield_dropped(str(flatfield_path))
# Auto-load existing flatfield if present
flatfield_path = path.parent / f"{path.stem}_flatfield.npy"
if flatfield_path.exists():
self.log(f"Found existing flatfield: {flatfield_path.name}")
# Enable flatfield correction when an existing flatfield is found
self.flatfield_checkbox.setChecked(True)
self.on_flatfield_dropped(str(flatfield_path))
else:
# Uncheck flatfield correction when no flatfield is loaded
self.flatfield_checkbox.setChecked(False)

Collaborator Author

[Claude Code] Fixed in commit 4ed5f0b - checkbox only unchecked if no auto-load found.

Comment on lines +1406 to +1407
with tempfile.NamedTemporaryFile(suffix=".png", delete=False) as f:
fig.savefig(f.name, dpi=150, bbox_inches="tight")
Copilot AI Jan 4, 2026

The temporary file created for viewing the flatfield is not explicitly cleaned up. While using delete=False, the file remains on disk after viewing. Consider implementing proper cleanup, either by tracking the temp file and deleting it on application exit, or by using a context manager pattern with explicit cleanup after the viewer is opened.

Collaborator Author

[Claude Code] Skipped - repeat. Low value change, temp files cleaned by OS.

Comment on lines 1 to 281
"""
Flatfield correction module using BaSiCPy.

Provides functions to calculate and apply flatfield (and optionally darkfield)
correction for microscopy images.
"""

from pathlib import Path
from typing import List, Optional, Tuple

import numpy as np

try:
from basicpy import BaSiC

HAS_BASICPY = True
except ImportError:
HAS_BASICPY = False


def calculate_flatfield(
tiles: List[np.ndarray],
use_darkfield: bool = False,
constant_darkfield: bool = True,
) -> Tuple[np.ndarray, Optional[np.ndarray]]:
"""
Calculate flatfield (and optionally darkfield) using BaSiCPy.

Parameters
----------
tiles : list of ndarray
List of tile images, each with shape (C, Y, X).
use_darkfield : bool
Whether to also compute darkfield correction.
constant_darkfield : bool
If True, darkfield is reduced to a single constant value (median) per
channel. This is physically appropriate since dark current is typically
uniform across the sensor. Default is True.

Returns
-------
flatfield : ndarray
Flatfield correction array with shape (C, Y, X), float32.
darkfield : ndarray or None
Darkfield correction array with shape (C, Y, X), or None if not computed.
If constant_darkfield=True, each channel slice will be a constant value.

Raises
------
ImportError
If basicpy is not installed.
ValueError
If tiles list is empty or tiles have inconsistent shapes.
"""
if not HAS_BASICPY:
raise ImportError(
"basicpy is required for flatfield calculation. Install with: pip install basicpy"
)

if not tiles:
raise ValueError("tiles list is empty")

# Get shape from first tile
n_channels = tiles[0].shape[0]
tile_shape = tiles[0].shape[1:] # (Y, X)

# Validate all tiles have same shape
for i, tile in enumerate(tiles):
if tile.shape[0] != n_channels:
raise ValueError(f"Tile {i} has {tile.shape[0]} channels, expected {n_channels}")
if tile.shape[1:] != tile_shape:
raise ValueError(f"Tile {i} has shape {tile.shape[1:]}, expected {tile_shape}")

# Calculate flatfield per channel
flatfield = np.zeros((n_channels,) + tile_shape, dtype=np.float32)
darkfield = np.zeros((n_channels,) + tile_shape, dtype=np.float32) if use_darkfield else None

for ch in range(n_channels):
# Stack channel data from all tiles: shape (n_tiles, Y, X)
channel_stack = np.stack([tile[ch] for tile in tiles], axis=0)

# Create BaSiC instance and fit
basic = BaSiC(get_darkfield=use_darkfield, smoothness_flatfield=1.0)
basic.fit(channel_stack)

flatfield[ch] = basic.flatfield.astype(np.float32)

if use_darkfield:
if constant_darkfield:
# Use median value for constant darkfield (more robust than mean)
df_value = np.median(basic.darkfield)
darkfield[ch] = np.full(tile_shape, df_value, dtype=np.float32)
else:
darkfield[ch] = basic.darkfield.astype(np.float32)

return flatfield, darkfield


def apply_flatfield(
tile: np.ndarray,
flatfield: np.ndarray,
darkfield: Optional[np.ndarray] = None,
) -> np.ndarray:
"""
Apply flatfield correction to a tile.

Formula:
If darkfield is provided: corrected = (raw - darkfield) / flatfield
Otherwise: corrected = raw / flatfield

Parameters
----------
tile : ndarray
Input tile with shape (C, Y, X).
flatfield : ndarray
Flatfield correction array with shape (C, Y, X).
darkfield : ndarray, optional
Darkfield correction array with shape (C, Y, X).

Returns
-------
corrected : ndarray
Corrected tile with shape (C, Y, X), cast back to the input dtype.
Note: Values are computed in float then cast to the original dtype,
which may result in clipping for values outside the dtype's valid
range (e.g., negative values clipped to 0 for unsigned types).

Raises
------
ValueError
If tile and flatfield shapes are incompatible.
"""
# Validate shapes
if tile.shape != flatfield.shape:
raise ValueError(
f"Tile shape {tile.shape} does not match flatfield shape {flatfield.shape}"
)
if darkfield is not None and tile.shape != darkfield.shape:
raise ValueError(
f"Tile shape {tile.shape} does not match darkfield shape {darkfield.shape}"
)

# Avoid division by zero
flatfield_safe = np.where(flatfield > 1e-6, flatfield, 1.0)

if darkfield is not None:
corrected = (tile - darkfield) / flatfield_safe
else:
corrected = tile / flatfield_safe

return corrected.astype(tile.dtype)


def apply_flatfield_region(
region: np.ndarray,
flatfield: np.ndarray,
darkfield: Optional[np.ndarray],
y_slice: slice,
x_slice: slice,
) -> np.ndarray:
"""
Apply flatfield correction to a tile region.

Parameters
----------
region : ndarray
Input region with shape (C, h, w) or (h, w).
flatfield : ndarray
Full flatfield correction array with shape (C, Y, X).
darkfield : ndarray, optional
Full darkfield correction array with shape (C, Y, X).
y_slice, x_slice : slice
Slices defining the region within the full tile.

Returns
-------
corrected : ndarray
Corrected region with same shape as input.

Raises
------
ValueError
If region and flatfield shapes are incompatible.
"""
# Validate channel count for 3D regions
if region.ndim == 3 and region.shape[0] != flatfield.shape[0]:
raise ValueError(
f"Region has {region.shape[0]} channels but flatfield has {flatfield.shape[0]} channels"
)

# Extract corresponding flatfield/darkfield regions
if region.ndim == 2:
ff_region = flatfield[0, y_slice, x_slice]
df_region = darkfield[0, y_slice, x_slice] if darkfield is not None else None
else:
ff_region = flatfield[:, y_slice, x_slice]
df_region = darkfield[:, y_slice, x_slice] if darkfield is not None else None

# Avoid division by zero
ff_safe = np.where(ff_region > 1e-6, ff_region, 1.0)

if df_region is not None:
corrected = (region - df_region) / ff_safe
else:
corrected = region / ff_safe

return corrected.astype(region.dtype)


def save_flatfield(
path: Path,
flatfield: np.ndarray,
darkfield: Optional[np.ndarray] = None,
) -> None:
"""
Save flatfield (and optionally darkfield) to a .npy file.

Parameters
----------
path : Path
Output path (should end with .npy).
flatfield : ndarray
Flatfield array with shape (C, Y, X).
darkfield : ndarray, optional
Darkfield array with shape (C, Y, X).
"""
data = {
"flatfield": flatfield.astype(np.float32),
"darkfield": darkfield.astype(np.float32) if darkfield is not None else None,
"channels": flatfield.shape[0],
"shape": flatfield.shape[1:],
}
np.save(path, data, allow_pickle=True)


def load_flatfield(path: Path) -> Tuple[np.ndarray, Optional[np.ndarray]]:
"""
Load flatfield (and optionally darkfield) from a .npy file.

Parameters
----------
path : Path
Path to .npy file.

Returns
-------
flatfield : ndarray
Flatfield array with shape (C, Y, X).
darkfield : ndarray or None
Darkfield array with shape (C, Y, X), or None if not present.

Raises
------
OSError
If the file cannot be read (not found, permission denied, etc.).
ValueError
If the file format is invalid (not a dictionary with 'flatfield' key).
"""
try:
loaded = np.load(path, allow_pickle=True)
except OSError as exc:
raise OSError(f"Cannot read flatfield file '{path}': {exc}") from exc

try:
data = loaded.item()
except (AttributeError, ValueError) as exc:
raise ValueError(
f"Invalid flatfield file format at '{path}'. "
"Expected a NumPy .npy file containing a dictionary as saved by "
"`save_flatfield` (with keys like 'flatfield' and 'darkfield')."
) from exc

if not isinstance(data, dict) or "flatfield" not in data:
raise ValueError(
f"Invalid flatfield file format at '{path}'. "
"Expected a dictionary with at least a 'flatfield' entry."
)

flatfield = data["flatfield"]
darkfield = data.get("darkfield", None)
return flatfield, darkfield
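
For orientation, a minimal end-to-end sketch of how these functions fit together (shapes, values, and the output filename are illustrative; assumes basicpy is installed and the functions above are importable):

    from pathlib import Path
    import numpy as np

    # Sample tiles: 2 channels of 256x256 uint16 data (synthetic here, real tiles in practice).
    tiles = [np.random.randint(0, 4096, size=(2, 256, 256), dtype=np.uint16) for _ in range(20)]

    # Estimate per-channel flatfield and a constant darkfield, then persist it.
    flatfield, darkfield = calculate_flatfield(tiles, use_darkfield=True, constant_darkfield=True)
    save_flatfield(Path("example_flatfield.npy"), flatfield, darkfield)

    # Later: reload and correct a raw tile; shape and dtype are preserved.
    flatfield, darkfield = load_flatfield(Path("example_flatfield.npy"))
    corrected = apply_flatfield(tiles[0], flatfield, darkfield)
    assert corrected.shape == tiles[0].shape and corrected.dtype == tiles[0].dtype
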
Copilot AI Jan 4, 2026

The new flatfield module lacks test coverage. Given that the repository has comprehensive tests for other modules (test_fusion.py, test_io.py, etc.), tests should be added for the flatfield functionality, including: tile validation, flatfield calculation with/without darkfield, constant darkfield reduction, flatfield application with different dtypes (especially unsigned integer types), save/load roundtrip, and error cases.

Collaborator Author

[Claude Code] Skipped - repeat. Test coverage out of scope for this PR fix cycle.
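
For whoever adds them later, a minimal pytest sketch of the roundtrip and dtype cases the reviewer lists could look like this (test names, and the assumption that apply_flatfield/save_flatfield/load_flatfield are importable from tilefusion, are illustrative):

    import numpy as np
    from tilefusion import apply_flatfield, save_flatfield, load_flatfield

    def test_save_load_roundtrip(tmp_path):
        flatfield = np.random.uniform(0.5, 1.5, size=(2, 8, 8)).astype(np.float32)
        darkfield = np.full((2, 8, 8), 100.0, dtype=np.float32)
        out = tmp_path / "flatfield.npy"
        save_flatfield(out, flatfield, darkfield)
        loaded_ff, loaded_df = load_flatfield(out)
        np.testing.assert_allclose(loaded_ff, flatfield)
        np.testing.assert_allclose(loaded_df, darkfield)

    def test_apply_flatfield_preserves_uint16():
        tile = np.full((1, 4, 4), 1000, dtype=np.uint16)
        flatfield = np.full((1, 4, 4), 2.0, dtype=np.float32)
        corrected = apply_flatfield(tile, flatfield)
        assert corrected.dtype == np.uint16
        assert corrected.max() == 500  # 1000 / 2.0, cast back without wraparound
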

Comment on lines 143 to 145
# Avoid division by zero
flatfield_safe = np.where(flatfield > 1e-6, flatfield, 1.0)

Copilot AI Jan 4, 2026

The threshold value 1e-6 for avoiding division by zero (line 144) might be too small for some data types. For float16 data, this is very close to the minimum positive normal value. Consider using a threshold relative to the flatfield's dtype or data range, or document why this specific value was chosen.

Suggested change
# Avoid division by zero
flatfield_safe = np.where(flatfield > 1e-6, flatfield, 1.0)
# Avoid division by zero / extremely small values.
# Use a dtype-aware threshold so that for low-precision floats (e.g. float16)
# we do not rely on a hard-coded absolute value that is too close to the
# smallest representable positive number.
if np.issubdtype(flatfield.dtype, np.floating):
finfo = np.finfo(flatfield.dtype)
# Use a small multiple of machine epsilon, but not smaller than 1e-6 for
# higher-precision dtypes to approximate the original behavior.
threshold = max(1e-6, 10 * float(finfo.eps))
else:
# For non-floating types, fall back to the original constant threshold.
threshold = 1e-6
flatfield_safe = np.where(flatfield > threshold, flatfield, 1.0)

Collaborator Author

[Claude Code] Skipped - float16 not used in microscopy. uint8/uint16/float32 are standard. 1e-6 is fine.

- Fix #1 & #6: Convert to float32 before arithmetic to avoid uint underflow
- Fix #7: Clear flatfield/darkfield to None on load failure
- Fix #8: Support 2D arrays (Y, X) by auto-converting to 3D (1, Y, X)
- Fix #15: Only uncheck flatfield checkbox if no auto-load found

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Contributor

Copilot AI left a comment

Pull request overview

Copilot reviewed 4 out of 5 changed files in this pull request and generated 15 comments.



If the file format is invalid (not a dictionary with 'flatfield' key).
"""
try:
loaded = np.load(path, allow_pickle=True)
Copilot AI Jan 4, 2026

Using allow_pickle=True when loading .npy files poses a security risk. Pickle can execute arbitrary code during deserialization, making it vulnerable to malicious files. Since the flatfield data structure is simple (a dictionary with numpy arrays), consider using a safer format like npz (np.savez/np.load with allow_pickle=False) or HDF5, which don't have this vulnerability.

Collaborator Author

[Claude Code] Skipped - repeat. Local file loading from user-selected files is not a security risk.
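
If a pickle-free format were ever wanted, a rough sketch using np.savez (not what this PR ships) would be:

    import numpy as np

    def save_flatfield_npz(path, flatfield, darkfield=None):
        # Each array gets its own named entry; no object pickling involved.
        arrays = {"flatfield": flatfield.astype(np.float32)}
        if darkfield is not None:
            arrays["darkfield"] = darkfield.astype(np.float32)
        np.savez(path, **arrays)

    def load_flatfield_npz(path):
        with np.load(path, allow_pickle=False) as data:
            flatfield = data["flatfield"]
            darkfield = data["darkfield"] if "darkfield" in data.files else None
        return flatfield, darkfield
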

gui/app.py Outdated
if sys.platform == "darwin":
subprocess.Popen(["open", f.name])
elif sys.platform == "win32":
subprocess.Popen(["start", f.name], shell=True)
Copilot AI Jan 4, 2026

On Windows, using shell=True with subprocess.Popen is a security risk, especially since the filename comes from user input (albeit through tempfile). While tempfile should generate safe names, using shell=True unnecessarily expands the attack surface. The correct Windows approach is subprocess.Popen(['cmd', '/c', 'start', '', f.name]) without shell=True, or use os.startfile(f.name) which is safer and more Pythonic.

Suggested change
subprocess.Popen(["start", f.name], shell=True)
os.startfile(f.name)

Collaborator Author

[Claude Code] Fixed in commit 103e5a9 - now uses os.startfile() on Windows.

# Save to temp file and open with system viewer
with tempfile.NamedTemporaryFile(suffix=".png", delete=False) as f:
fig.savefig(f.name, dpi=150, bbox_inches="tight")
plt.close(fig)
Copilot AI Jan 4, 2026

Creating a new matplotlib figure for every view operation without proper cleanup, combined with using a non-interactive backend and immediately closing the figure, is inefficient. The figure is created, saved to disk, and closed, but matplotlib may still retain some state. While plt.close(fig) is called, consider explicitly setting matplotlib.pyplot.close('all') after saving to ensure all resources are released, especially if users view the flatfield multiple times.

Suggested change
plt.close(fig)
plt.close(fig)
plt.close("all")

Collaborator Author

[Claude Code] Skipped - plt.close(fig) is sufficient. plt.close('all') could affect other figures.

Comment on lines 1328 to 1347
def on_flatfield_dropped(self, file_path):
import numpy as np

try:
from tilefusion import load_flatfield

self.flatfield, self.darkfield = load_flatfield(file_path)
n_channels = self.flatfield.shape[0]
status = f"Loaded ({n_channels} channels)"
if self.darkfield is not None:
status += " + darkfield"
self.flatfield_status.setText(status)
self.flatfield_status.setStyleSheet(
"color: #34c759; font-size: 11px; font-weight: 600;"
)
self.view_flatfield_button.setEnabled(True)
self.clear_flatfield_button.setEnabled(True)
# Enable flatfield correction when successfully loaded
self.flatfield_checkbox.setChecked(True)
self.log(f"Loaded flatfield from {file_path}: {self.flatfield.shape}")
Copilot AI Jan 4, 2026

There's no validation that the loaded flatfield/darkfield shapes are compatible with the input image before attempting stitching or preview. If a user loads a flatfield calculated from a different image with different tile dimensions or channel count, the shape mismatch will only be caught when apply_flatfield is called during tile reading, resulting in a runtime error during processing rather than immediate feedback. Consider adding shape validation when flatfield is loaded or when stitching/preview begins.

Collaborator Author

[Claude Code] Skipped - shape validation at use-time already provides clear errors.
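
If early feedback were ever wanted instead, a sketch of the check the reviewer describes (the attribute names on the GUI class are assumptions, not the shipped code):

    def _flatfield_matches_input(self) -> bool:
        # Compare the loaded correction against the currently loaded dataset, if any.
        if self.flatfield is None or self.tile_shape is None:
            return True
        expected = (self.n_channels,) + tuple(self.tile_shape)  # (C, Y, X)
        return self.flatfield.shape == expected
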

gui/app.py Outdated
tiles.append(tile)

self.progress.emit("Calculating flatfield with BaSiCPy...")
flatfield, darkfield = calculate_flatfield(tiles, use_darkfield=self.use_darkfield)
Copilot AI Jan 4, 2026

The calculate_flatfield function has a constant_darkfield parameter that defaults to True, but this parameter is not exposed in the GUI's FlatfieldWorker. Users always get constant darkfield behavior without the option to disable it. If the constant darkfield behavior is always desired, consider removing the parameter from the public API. Otherwise, add a UI control to let users choose between constant and spatially-varying darkfield correction.

Suggested change
flatfield, darkfield = calculate_flatfield(tiles, use_darkfield=self.use_darkfield)
flatfield, darkfield = calculate_flatfield(
tiles,
use_darkfield=self.use_darkfield,
constant_darkfield=True,
)

Collaborator Author

[Claude Code] Skipped - repeat. Constant darkfield is physically correct default.

Comment on lines +213 to +215
return corrected.astype(region.dtype)


Copilot AI Jan 4, 2026

The corrected values are cast back to the original dtype without explicit clipping. For unsigned integer types, negative values from (region - darkfield) will underflow, and values exceeding the dtype maximum will overflow. While this may wrap around for integer types, it can lead to incorrect pixel values after correction. Consider using np.clip to explicitly handle out-of-range values before casting.

Suggested change
return corrected.astype(region.dtype)
# Clip to the valid range of the original dtype before casting back
orig_dtype = region.dtype
if np.issubdtype(orig_dtype, np.integer):
info = np.iinfo(orig_dtype)
corrected = np.clip(corrected, info.min, info.max)
return corrected.astype(orig_dtype)

Collaborator Author

[Claude Code] Fixed in commit 103e5a9 - now explicitly clips to valid range for integer dtypes before casting.

Comment on lines 128 to 130
Note: Values are computed in float then cast to the original dtype,
which may result in clipping for values outside the dtype's valid
range (e.g., negative values clipped to 0 for unsigned types).
Copilot AI Jan 4, 2026

The docstring states that values are "cast back to the input dtype" and mentions clipping "may result" in certain behavior, but this is misleading. The actual implementation uses astype() which doesn't clip - it wraps around for integer types. The documentation should accurately describe the actual behavior or the implementation should be fixed to match the documented behavior with explicit clipping.

Suggested change
Note: Values are computed in float then cast to the original dtype,
which may result in clipping for values outside the dtype's valid
range (e.g., negative values clipped to 0 for unsigned types).
Note: Values are computed in float and then cast to the original
dtype using ``numpy.ndarray.astype``. For integer dtypes this cast
does not clip but follows NumPy's usual overflow / wraparound rules
(e.g., ``-1`` becomes ``255`` for ``uint8``).

Collaborator Author

[Claude Code] Fixed in commit 103e5a9 - docstring now accurately describes clipping behavior.

gui/app.py Outdated

self.progress.emit("Loading metadata...")

# Create TileFusion instance to read tiles
Copilot AI Jan 4, 2026

When creating a TileFusion instance for flatfield calculation, no flatfield/darkfield parameters are passed. However, if the user has already calculated a flatfield in the GUI and then tries to calculate a new one from the same file, the TileFusion instance here won't apply any correction to the sampled tiles. This is actually correct behavior (you want raw tiles to calculate flatfield), but it might be worth adding a comment explaining this intentional omission to prevent future confusion.

Suggested change
# Create TileFusion instance to read tiles
# Create TileFusion instance to read tiles.
# NOTE: We intentionally do NOT pass any flatfield/darkfield parameters here.
# Even if the user has already computed a flatfield in the GUI for this file,
# flatfield estimation must be performed on raw, uncorrected tiles. Passing an
# existing correction would bias the new flatfield calculation.

Collaborator Author

[Claude Code] Fixed in commit 103e5a9 - added comment explaining intentional no-flatfield for raw tile sampling.

else:
corrected = tile_f / flatfield_safe

return corrected.astype(tile.dtype)
Copilot AI Jan 4, 2026

The corrected values are cast back to the original dtype without explicit clipping. For unsigned integer types, negative values from (tile - darkfield) will underflow, and values exceeding the dtype maximum will overflow. While the docstring mentions that values may be clipped, NumPy's astype doesn't clip by default - it wraps around for integer types. This can lead to incorrect pixel values after correction. Consider using np.clip to explicitly handle out-of-range values before casting.

Collaborator Author

[Claude Code] Fixed in commit 103e5a9 - same fix applied to apply_flatfield.

try:
import matplotlib

matplotlib.use("Agg") # Non-interactive backend
Copilot AI Jan 4, 2026

Setting matplotlib backend to 'Agg' in a function that can be called multiple times will fail on subsequent calls if a different backend was already in use. The matplotlib.use() function raises an error if called after pyplot has been imported and used. Consider wrapping this in a try-except block or checking if the backend needs to be changed before calling use().

Suggested change
matplotlib.use("Agg") # Non-interactive backend
# Use a non-interactive backend when possible, but avoid changing it
# after matplotlib.pyplot has already been imported elsewhere.
if "matplotlib.pyplot" not in sys.modules:
try:
matplotlib.use("Agg") # Non-interactive backend
except Exception:
# If the backend cannot be changed (e.g., already in use),
# fall back to the existing backend without failing.
pass

Collaborator Author

[Claude Code] Skipped - repeat. Backend set for non-interactive PNG generation only.

- Fix #2: Use os.startfile on Windows instead of shell=True
- Fix #7: Add comment about private method tf._read_tile usage
- Fix #11 & #14: Add explicit clipping before dtype cast for integer types
- Fix #12: Update docstring to accurately describe clipping behavior
- Fix #13: Add comment explaining intentional no-flatfield for raw sampling

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Contributor

Copilot AI left a comment

Pull request overview

Copilot reviewed 4 out of 5 changed files in this pull request and generated 7 comments.



If the file format is invalid (not a dictionary with 'flatfield' key).
"""
try:
loaded = np.load(path, allow_pickle=True)
Copilot AI Jan 4, 2026

Using allow_pickle=True when loading .npy files (line 275) can be a security risk as pickle can execute arbitrary code during deserialization. While this may be necessary for loading dictionary objects, consider documenting this security implication in the docstring and warning users to only load flatfield files from trusted sources. Alternatively, consider using a safer serialization format like JSON with separate .npy files for the arrays.

Collaborator Author

[Claude Code] Skipped - repeat. Local file loading from user-selected files.

Comment on lines 1229 to 1235
# Auto-load existing flatfield if present, otherwise disable correction
flatfield_path = path.parent / f"{path.stem}_flatfield.npy"
if flatfield_path.exists():
self.log(f"Found existing flatfield: {flatfield_path.name}")
self.on_flatfield_dropped(str(flatfield_path))
else:
self.flatfield_checkbox.setChecked(False)
Copilot AI Jan 4, 2026

The auto-load flatfield logic uses path.stem which may produce unexpected results when the input is a directory (SQUID folder). For directories, path.stem returns the directory name, not a filename without extension. This could lead to incorrect flatfield file paths like parent_dir/squid_folder_name_flatfield.npy instead of the expected location. Consider handling directory and file paths differently or using the actual dataset identifier instead of path.stem.

Collaborator Author

[Claude Code] Fixed in commit 0be9878 - now checks inside SQUID folder first, then next to it.
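
Roughly, the lookup order described in that fix looks like the following (the helper name and exact candidate paths are illustrative, not the committed code):

    from pathlib import Path
    from typing import Optional

    def find_existing_flatfield(input_path: Path) -> Optional[Path]:
        input_path = Path(input_path)
        if input_path.is_dir():
            # SQUID folder: check inside the folder first, then next to it.
            candidates = [
                input_path / f"{input_path.name}_flatfield.npy",
                input_path.parent / f"{input_path.name}_flatfield.npy",
            ]
        else:
            # File input (e.g. OME-TIFF): look next to the file, named after its stem.
            candidates = [input_path.parent / f"{input_path.stem}_flatfield.npy"]
        for candidate in candidates:
            if candidate.exists():
                return candidate
        return None
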

flatfield_path = path.parent / f"{path.stem}_flatfield.npy"
if flatfield_path.exists():
self.log(f"Found existing flatfield: {flatfield_path.name}")
self.on_flatfield_dropped(str(flatfield_path))
Copilot AI Jan 4, 2026

When auto-loading an existing flatfield on line 1233, the flatfield_drop_area is not updated to reflect the loaded file. The drop area will still show "Drop flatfield .npy file here" even though a flatfield has been successfully loaded. Consider calling self.flatfield_drop_area.setFile(str(flatfield_path)) after successful auto-load to keep the UI state consistent.

Suggested change
self.on_flatfield_dropped(str(flatfield_path))
self.on_flatfield_dropped(str(flatfield_path))
# Keep flatfield drop area UI consistent with the auto-loaded file
self.flatfield_drop_area.setFile(str(flatfield_path))

Collaborator Author

[Claude Code] Fixed in commit 0be9878 - drop area now updated after auto-load.

"color: #34c759; font-size: 11px; font-weight: 600;"
)
self.view_flatfield_button.setEnabled(True)
self.clear_flatfield_button.setEnabled(True)
Copilot AI Jan 4, 2026

When a flatfield is loaded from a file using on_flatfield_dropped, the save_flatfield_button is not enabled (line 1348 doesn't enable it). This means users cannot save a loaded flatfield to a different location. If this is intentional to prevent re-saving already-saved files, consider the use case where a user might want to copy or rename the flatfield file through the GUI. If not intentional, add self.save_flatfield_button.setEnabled(True) in the success path.

Suggested change
self.clear_flatfield_button.setEnabled(True)
self.clear_flatfield_button.setEnabled(True)
self.save_flatfield_button.setEnabled(True)

Collaborator Author

[Claude Code] Fixed in commit 0be9878 - save button now enabled after loading flatfield.

Comment on lines +1367 to +1368

matplotlib.use("Agg") # Non-interactive backend
Copilot AI Jan 4, 2026

Setting the matplotlib backend to 'Agg' (non-interactive) inside the view_flatfield method might cause issues if matplotlib has already been initialized with a different backend elsewhere in the application. The matplotlib.use() call must happen before importing pyplot, but here pyplot is imported after. This could raise a warning or fail silently. Consider moving the backend configuration to module initialization or use a context manager approach.

Suggested change
matplotlib.use("Agg") # Non-interactive backend

Collaborator Author

[Claude Code] Skipped - repeat. Backend set before pyplot import in the function.


# Enable flatfield checkbox
self.flatfield_checkbox = QCheckBox("Enable flatfield correction")
self.flatfield_checkbox.setChecked(True) # Default enabled
Copilot AI Jan 4, 2026

The flatfield correction checkbox defaults to enabled (line 993), which matches the PR description under "UI Changes". However, the auto-load logic at line 1235 unchecks it when no existing flatfield is found. This creates inconsistent behavior: initially checked by default, but immediately unchecked if no flatfield file exists. Consider either defaulting to unchecked, or disabling (greying out) the checkbox when no file exists without changing its checked state (and showing a warning when a run is started with correction enabled but no flatfield loaded).

Suggested change
self.flatfield_checkbox.setChecked(True) # Default enabled
self.flatfield_checkbox.setChecked(False) # Default disabled; will be enabled if a flatfield is auto-loaded

Collaborator Author

[Claude Code] Skipped - current behavior is correct: initially checked, unchecked if no flatfield, re-checked on auto-load.

plt.tight_layout()

# Save to temp file and open with system viewer
with tempfile.NamedTemporaryFile(suffix=".png", delete=False) as f:
Copilot AI Jan 4, 2026

The temp file created for viewing the flatfield uses delete=False but is never explicitly cleaned up. While the OS will eventually clean up temp files, consider using tempfile.mkstemp() or ensuring proper cleanup after the viewer opens the file. This could lead to accumulation of temp files if the user views the flatfield multiple times.

Collaborator Author

[Claude Code] Skipped - repeat. Low value change, OS handles temp cleanup.
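
If cleanup were ever added, one low-effort option (not part of this PR; names are illustrative) is to track the files and delete them when the application exits:

    import atexit
    import os
    import tempfile

    _preview_files = []

    def _cleanup_previews():
        for path in _preview_files:
            try:
                os.remove(path)
            except OSError:
                pass  # already gone or still open in a viewer

    atexit.register(_cleanup_previews)

    def save_preview_png(fig):
        # Save the figure to a tracked temp file; it is removed on application exit.
        with tempfile.NamedTemporaryFile(suffix=".png", delete=False) as f:
            fig.savefig(f.name, dpi=150, bbox_inches="tight")
        _preview_files.append(f.name)
        return f.name
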

- Fix #2: Handle SQUID directories by checking inside folder first, then next to it
- Fix #3: Update drop area UI after successful auto-load
- Fix #4: Enable save button after loading flatfield

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
@hongquanli hongquanli requested a review from Copilot January 4, 2026 23:10
Contributor

Copilot AI left a comment

Pull request overview

Copilot reviewed 4 out of 5 changed files in this pull request and generated 9 comments.



"channels": flatfield.shape[0],
"shape": flatfield.shape[1:],
}
np.save(path, data, allow_pickle=True)
Copilot AI Jan 4, 2026

The save_flatfield function uses allow_pickle=True when saving with np.save. Loading pickled data with allow_pickle=True is a security risk as it can execute arbitrary code. While this is required for saving dictionaries, consider documenting this security consideration or using a safer format like npz with named arrays (np.savez) instead.

Collaborator Author

[Claude Code] Skipped - repeat. Local files from user selection.

If the file format is invalid (not a dictionary with 'flatfield' key).
"""
try:
loaded = np.load(path, allow_pickle=True)
Copilot AI Jan 4, 2026

Loading with allow_pickle=True is a security risk as it can execute arbitrary code from untrusted .npy files. Consider using np.load with allow_pickle=False and switching to npz format (np.savez/np.load with allow_pickle=False), or at least document this security consideration prominently in the function docstring.

Collaborator Author

[Claude Code] Skipped - repeat. Local files from user selection.

df_region = darkfield[:, y_slice, x_slice] if darkfield is not None else None

# Convert to float32 to avoid underflow with unsigned integer types
region_f = region.astype(np.float32)
Copilot AI Jan 4, 2026

The same flatfield_safe calculation issue applies here. When flatfield values are near zero (< 1e-6), the correction defaults to dividing by 1.0, effectively skipping correction for those pixels. This behavior should be documented or the threshold should be explained in a comment.

Suggested change
region_f = region.astype(np.float32)
region_f = region.astype(np.float32)
# Avoid division by very small / zero flatfield values: for ff_region <= 1e-6,
# we fall back to 1.0, effectively skipping flatfield correction for those pixels.

Collaborator Author

[Claude Code] Fixed in commit e7082fe - added comment explaining threshold behavior.

self.on_flatfield_dropped(str(flatfield_path))
self.flatfield_drop_area.setFile(str(flatfield_path))
else:
self.flatfield_checkbox.setChecked(False)
Copilot AI Jan 4, 2026

When a flatfield file is not found, the flatfield correction checkbox is automatically disabled. This could be confusing for users who want to calculate a new flatfield or load one manually. Consider keeping the checkbox enabled and showing a message in the status label instead, or make this behavior more explicit in the UI.

Collaborator Author

[Claude Code] Skipped - current UX is correct. User can still calculate or load flatfield.

Comment on lines +1423 to +1424
with tempfile.NamedTemporaryFile(suffix=".png", delete=False) as f:
fig.savefig(f.name, dpi=150, bbox_inches="tight")
Copilot AI Jan 4, 2026

The temporary file is created with delete=False but is never explicitly cleaned up. This will leave temporary PNG files accumulating in the system's temp directory. Consider implementing cleanup, either after a delay or by tracking opened temp files and cleaning them up when the application closes.

Collaborator Author

[Claude Code] Skipped - repeat. OS handles temp cleanup.

gui/app.py Outdated
try:
from tilefusion import load_flatfield

self.flatfield, self.darkfield = load_flatfield(file_path)
Copilot AI Jan 4, 2026

The file_path passed to load_flatfield should be converted to a Path object since load_flatfield expects a Path parameter. While it may work due to implicit conversion, it's better to be explicit for type safety and clarity.

Suggested change
self.flatfield, self.darkfield = load_flatfield(file_path)
self.flatfield, self.darkfield = load_flatfield(Path(file_path))

Collaborator Author

[Claude Code] Fixed in commit e7082fe - now explicitly converts to Path.

Comment on lines 1 to 296
"""
Flatfield correction module using BaSiCPy.

Provides functions to calculate and apply flatfield (and optionally darkfield)
correction for microscopy images.
"""

from pathlib import Path
from typing import List, Optional, Tuple

import numpy as np

try:
from basicpy import BaSiC

HAS_BASICPY = True
except ImportError:
HAS_BASICPY = False


def calculate_flatfield(
tiles: List[np.ndarray],
use_darkfield: bool = False,
constant_darkfield: bool = True,
) -> Tuple[np.ndarray, Optional[np.ndarray]]:
"""
Calculate flatfield (and optionally darkfield) using BaSiCPy.

Parameters
----------
tiles : list of ndarray
List of tile images, each with shape (C, Y, X) or (Y, X) for single-channel.
2D arrays are automatically converted to 3D with shape (1, Y, X).
use_darkfield : bool
Whether to also compute darkfield correction.
constant_darkfield : bool
If True, darkfield is reduced to a single constant value (median) per
channel. This is physically appropriate since dark current is typically
uniform across the sensor. Default is True.

Returns
-------
flatfield : ndarray
Flatfield correction array with shape (C, Y, X), float32.
darkfield : ndarray or None
Darkfield correction array with shape (C, Y, X), or None if not computed.
If constant_darkfield=True, each channel slice will be a constant value.

Raises
------
ImportError
If basicpy is not installed.
ValueError
If tiles list is empty or tiles have inconsistent shapes.
"""
if not HAS_BASICPY:
raise ImportError(
"basicpy is required for flatfield calculation. Install with: pip install basicpy"
)

if not tiles:
raise ValueError("tiles list is empty")

# Support 2D (Y, X) arrays by converting to 3D (1, Y, X)
tiles = [t[np.newaxis, ...] if t.ndim == 2 else t for t in tiles]

# Get shape from first tile
n_channels = tiles[0].shape[0]
tile_shape = tiles[0].shape[1:] # (Y, X)

# Validate all tiles have same shape
for i, tile in enumerate(tiles):
if tile.shape[0] != n_channels:
raise ValueError(f"Tile {i} has {tile.shape[0]} channels, expected {n_channels}")
if tile.shape[1:] != tile_shape:
raise ValueError(f"Tile {i} has shape {tile.shape[1:]}, expected {tile_shape}")

# Calculate flatfield per channel
flatfield = np.zeros((n_channels,) + tile_shape, dtype=np.float32)
darkfield = np.zeros((n_channels,) + tile_shape, dtype=np.float32) if use_darkfield else None

for ch in range(n_channels):
# Stack channel data from all tiles: shape (n_tiles, Y, X)
channel_stack = np.stack([tile[ch] for tile in tiles], axis=0)

# Create BaSiC instance and fit
basic = BaSiC(get_darkfield=use_darkfield, smoothness_flatfield=1.0)
basic.fit(channel_stack)

flatfield[ch] = basic.flatfield.astype(np.float32)

if use_darkfield:
if constant_darkfield:
# Use median value for constant darkfield (more robust than mean)
df_value = np.median(basic.darkfield)
darkfield[ch] = np.full(tile_shape, df_value, dtype=np.float32)
else:
darkfield[ch] = basic.darkfield.astype(np.float32)

return flatfield, darkfield


def apply_flatfield(
tile: np.ndarray,
flatfield: np.ndarray,
darkfield: Optional[np.ndarray] = None,
) -> np.ndarray:
"""
Apply flatfield correction to a tile.

Formula:
If darkfield is provided: corrected = (raw - darkfield) / flatfield
Otherwise: corrected = raw / flatfield

Parameters
----------
tile : ndarray
Input tile with shape (C, Y, X).
flatfield : ndarray
Flatfield correction array with shape (C, Y, X).
darkfield : ndarray, optional
Darkfield correction array with shape (C, Y, X).

Returns
-------
corrected : ndarray
Corrected tile with shape (C, Y, X), cast back to the input dtype.
For integer dtypes, values are clipped to the valid range before
casting (e.g., negative values clipped to 0 for unsigned types).

Raises
------
ValueError
If tile and flatfield shapes are incompatible.
"""
# Validate shapes
if tile.shape != flatfield.shape:
raise ValueError(
f"Tile shape {tile.shape} does not match flatfield shape {flatfield.shape}"
)
if darkfield is not None and tile.shape != darkfield.shape:
raise ValueError(
f"Tile shape {tile.shape} does not match darkfield shape {darkfield.shape}"
)

# Convert to float32 to avoid underflow with unsigned integer types
tile_f = tile.astype(np.float32)
flatfield_safe = np.where(flatfield > 1e-6, flatfield, 1.0).astype(np.float32)

if darkfield is not None:
corrected = (tile_f - darkfield.astype(np.float32)) / flatfield_safe
else:
corrected = tile_f / flatfield_safe

# Clip to valid range for integer dtypes to avoid wraparound
if np.issubdtype(tile.dtype, np.integer):
info = np.iinfo(tile.dtype)
corrected = np.clip(corrected, info.min, info.max)

return corrected.astype(tile.dtype)


def apply_flatfield_region(
region: np.ndarray,
flatfield: np.ndarray,
darkfield: Optional[np.ndarray],
y_slice: slice,
x_slice: slice,
) -> np.ndarray:
"""
Apply flatfield correction to a tile region.

Parameters
----------
region : ndarray
Input region with shape (C, h, w) or (h, w).
flatfield : ndarray
Full flatfield correction array with shape (C, Y, X).
darkfield : ndarray, optional
Full darkfield correction array with shape (C, Y, X).
y_slice, x_slice : slice
Slices defining the region within the full tile.

Returns
-------
corrected : ndarray
Corrected region with same shape as input.

Raises
------
ValueError
If region and flatfield shapes are incompatible.
"""
# Validate channel count for 3D regions
if region.ndim == 3 and region.shape[0] != flatfield.shape[0]:
raise ValueError(
f"Region has {region.shape[0]} channels but flatfield has {flatfield.shape[0]} channels"
)

# Extract corresponding flatfield/darkfield regions
if region.ndim == 2:
ff_region = flatfield[0, y_slice, x_slice]
df_region = darkfield[0, y_slice, x_slice] if darkfield is not None else None
else:
ff_region = flatfield[:, y_slice, x_slice]
df_region = darkfield[:, y_slice, x_slice] if darkfield is not None else None

# Convert to float32 to avoid underflow with unsigned integer types
region_f = region.astype(np.float32)
ff_safe = np.where(ff_region > 1e-6, ff_region, 1.0).astype(np.float32)

if df_region is not None:
corrected = (region_f - df_region.astype(np.float32)) / ff_safe
else:
corrected = region_f / ff_safe

# Clip to valid range for integer dtypes to avoid wraparound
if np.issubdtype(region.dtype, np.integer):
info = np.iinfo(region.dtype)
corrected = np.clip(corrected, info.min, info.max)

return corrected.astype(region.dtype)


def save_flatfield(
path: Path,
flatfield: np.ndarray,
darkfield: Optional[np.ndarray] = None,
) -> None:
"""
Save flatfield (and optionally darkfield) to a .npy file.

Parameters
----------
path : Path
Output path (should end with .npy).
flatfield : ndarray
Flatfield array with shape (C, Y, X).
darkfield : ndarray, optional
Darkfield array with shape (C, Y, X).
"""
data = {
"flatfield": flatfield.astype(np.float32),
"darkfield": darkfield.astype(np.float32) if darkfield is not None else None,
"channels": flatfield.shape[0],
"shape": flatfield.shape[1:],
}
np.save(path, data, allow_pickle=True)


def load_flatfield(path: Path) -> Tuple[np.ndarray, Optional[np.ndarray]]:
"""
Load flatfield (and optionally darkfield) from a .npy file.

Parameters
----------
path : Path
Path to .npy file.

Returns
-------
flatfield : ndarray
Flatfield array with shape (C, Y, X).
darkfield : ndarray or None
Darkfield array with shape (C, Y, X), or None if not present.

Raises
------
OSError
If the file cannot be read (not found, permission denied, etc.).
ValueError
If the file format is invalid (not a dictionary with 'flatfield' key).
"""
try:
loaded = np.load(path, allow_pickle=True)
except OSError as exc:
raise OSError(f"Cannot read flatfield file '{path}': {exc}") from exc

try:
data = loaded.item()
except (AttributeError, ValueError) as exc:
raise ValueError(
f"Invalid flatfield file format at '{path}'. "
"Expected a NumPy .npy file containing a dictionary as saved by "
"`save_flatfield` (with keys like 'flatfield' and 'darkfield')."
) from exc

if not isinstance(data, dict) or "flatfield" not in data:
raise ValueError(
f"Invalid flatfield file format at '{path}'. "
"Expected a dictionary with at least a 'flatfield' entry."
)

flatfield = data["flatfield"]
darkfield = data.get("darkfield", None)
return flatfield, darkfield
Copilot AI Jan 4, 2026

The new flatfield correction module lacks test coverage. Given that the repository has comprehensive test suites for other modules (test_fusion.py, test_io.py, test_registration.py, etc.), tests should be added for the flatfield module to cover key functionality including calculate_flatfield, apply_flatfield, save_flatfield, and load_flatfield functions, as well as edge cases like empty tiles, mismatched shapes, and error conditions.

Collaborator Author

[Claude Code] Skipped - repeat. Test coverage out of scope.

self.progress.emit(f"Sampling {n_samples} tiles from {n_tiles} total...")

# Random sample of tile indices
rng = np.random.default_rng(42)
Copilot AI Jan 4, 2026

Using a fixed random seed (42) for sampling tiles means the same tiles will always be selected for flatfield calculation. While this provides reproducibility, it may not be ideal if users want to recalculate with different samples. Consider making the seed configurable or documenting why a fixed seed is used.

Suggested change
rng = np.random.default_rng(42)
# Optional configurability: use STITCHER_FLATFIELD_SEED env var if set.
seed_env = os.getenv("STITCHER_FLATFIELD_SEED")
try:
seed = int(seed_env) if seed_env is not None else None
except (TypeError, ValueError):
seed = None
rng = np.random.default_rng(seed)

Collaborator Author

[Claude Code] Skipped - repeat. Fixed seed ensures reproducibility.

Comment on lines +1261 to +1277
def calculate_flatfield(self):
if not self.drop_area.file_path:
return

self.calc_flatfield_button.setEnabled(False)
self.flatfield_status.setText("Calculating flatfield...")
self.flatfield_status.setStyleSheet("color: #ff9500; font-size: 11px;")

self.flatfield_worker = FlatfieldWorker(
self.drop_area.file_path,
n_samples=50,
use_darkfield=self.darkfield_checkbox.isChecked(),
)
self.flatfield_worker.progress.connect(self.log)
self.flatfield_worker.finished.connect(self.on_flatfield_calculated)
self.flatfield_worker.error.connect(self.on_flatfield_error)
self.flatfield_worker.start()
Copilot AI Jan 4, 2026

If the user clicks "Calculate Flatfield" multiple times quickly, multiple worker threads could be created. The old worker thread is not stopped or cleaned up before starting a new one. Consider checking if self.flatfield_worker is already running and either preventing a new calculation or stopping the previous worker first.

Collaborator Author

[Claude Code] Skipped - edge case. Worker handles completion/error. Adding complexity for rare scenario.
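
For reference, the guard being suggested is roughly the following (a sketch assuming FlatfieldWorker is a QThread subclass, as in gui/app.py; not the shipped code):

    def calculate_flatfield(self):
        if not self.drop_area.file_path:
            return
        # Ignore repeated clicks while a calculation is already in flight.
        worker = getattr(self, "flatfield_worker", None)
        if worker is not None and worker.isRunning():
            self.log("Flatfield calculation already running")
            return
        # ... start a new FlatfieldWorker as before ...
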

- Fix #3: Add comment explaining flatfield_safe threshold behavior
- Fix #6: Convert file_path to Path explicitly for type safety

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Contributor

Copilot AI left a comment

Pull request overview

Copilot reviewed 4 out of 5 changed files in this pull request and generated 4 comments.



gui/app.py Outdated
from tilefusion import save_flatfield as save_ff

input_path = Path(self.drop_area.file_path)
auto_save_path = input_path.parent / f"{input_path.stem}_flatfield.npy"
Copilot AI Jan 4, 2026

The auto-save path construction is inconsistent with the auto-load logic. When the input is a directory (SQUID folder), input_path.stem will be the directory name, but this doesn't match the auto-load logic which uses path.name for directories. This inconsistency means that auto-saved flatfields for directory inputs won't be auto-loaded when the directory is selected again. Consider using input_path.name instead to ensure consistency between auto-save and auto-load behavior.

Collaborator Author

[Claude Code] Fixed in commit 543f4df - auto-save now uses path.name for directories, consistent with auto-load.

gui/app.py Outdated
# Default filename based on input
default_name = "flatfield.npy"
if self.drop_area.file_path:
default_name = f"{Path(self.drop_area.file_path).stem}_flatfield.npy"
Copilot AI Jan 4, 2026

The save dialog default path construction uses input_path.stem for both files and directories, which is inconsistent with auto-save behavior and may not produce intuitive filenames for directory inputs. When the input is a directory, Path.stem returns the directory name, but for files it returns the filename without extension. Consider handling directories and files separately, using input_path.name for directories to be consistent with the auto-save logic at line 1301.

Collaborator Author

[Claude Code] Fixed in commit 543f4df - save dialog now uses consistent path logic.

Comment on lines +1341 to +1361
def on_flatfield_dropped(self, file_path):
import numpy as np

try:
from tilefusion import load_flatfield

self.flatfield, self.darkfield = load_flatfield(Path(file_path))
n_channels = self.flatfield.shape[0]
status = f"Loaded ({n_channels} channels)"
if self.darkfield is not None:
status += " + darkfield"
self.flatfield_status.setText(status)
self.flatfield_status.setStyleSheet(
"color: #34c759; font-size: 11px; font-weight: 600;"
)
self.view_flatfield_button.setEnabled(True)
self.clear_flatfield_button.setEnabled(True)
self.save_flatfield_button.setEnabled(True)
# Enable flatfield correction when successfully loaded
self.flatfield_checkbox.setChecked(True)
self.log(f"Loaded flatfield from {file_path}: {self.flatfield.shape}")
Copilot AI Jan 4, 2026

When a flatfield is loaded from file (either manually or via auto-load), there is no validation that the flatfield shape matches the actual tile dimensions. If a user loads a flatfield file that was calculated for different tile dimensions, this could lead to shape mismatch errors during stitching when apply_flatfield is called. Consider adding validation in the GUI when the flatfield is loaded to check that its spatial dimensions match the tile dimensions, or defer this check to when stitching begins with a more informative error message.

Collaborator Author

[Claude Code] Skipped - shape validation at use-time provides clear errors. Early validation adds complexity.

self.on_flatfield_dropped(str(flatfield_path))
self.flatfield_drop_area.setFile(str(flatfield_path))
else:
self.flatfield_checkbox.setChecked(False)
Copilot AI Jan 4, 2026

The flatfield checkbox is checked by default (line 993), but when a new file is dropped without an existing flatfield file, the checkbox is unchecked (line 1244). This creates an inconsistency: the first time the GUI loads, flatfield is enabled by default, but after dropping a file without a flatfield, it's disabled. Consider either keeping it disabled by default for consistency, or maintaining the enabled state when no flatfield exists and displaying a helpful message.

Suggested change
self.flatfield_checkbox.setChecked(False)
# Keep the current flatfield checkbox state for consistency and
# inform the user that no existing flatfield was auto-detected.
self.log("No existing flatfield found; you can calculate or load one.")

Collaborator Author

[Claude Code] Skipped - current behavior is intuitive: enabled by default, disabled if no flatfield found.

- Fix #1: Auto-save uses path.name for directories, path.stem for files (consistent with auto-load)
- Fix #2: Save dialog uses same consistent path logic

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Contributor

Copilot AI left a comment

Pull request overview

Copilot reviewed 4 out of 5 changed files in this pull request and generated 2 comments.



self.fused_ts = None
self.center = None

# Flatfield correction (optional)
Copilot AI Jan 4, 2026

The flatfield and darkfield arrays should be validated against the tile shape when provided. If the shapes don't match, corrections will fail during tile reading with unclear error messages. Consider adding validation here to check that flatfield.shape matches (self.channels, self.Y, self.X) and that darkfield has the same shape if provided. This provides early, clear feedback rather than errors during tile processing.

Suggested change
# Flatfield correction (optional)
# Flatfield correction (optional)
if flatfield is not None or darkfield is not None:
expected_shape = (self.channels, self.Y, self.X)
if flatfield is not None and flatfield.shape != expected_shape:
raise ValueError(
f"flatfield.shape {flatfield.shape} does not match expected "
f"shape {expected_shape} (channels, Y, X)."
)
if darkfield is not None and darkfield.shape != expected_shape:
raise ValueError(
f"darkfield.shape {darkfield.shape} does not match expected "
f"shape {expected_shape} (channels, Y, X)."
)

Collaborator Author

[Claude Code] Fixed in commit 39448e8 - added early shape validation in TileFusion.init.

gui/app.py Outdated
tiles.append(tile)

self.progress.emit("Calculating flatfield with BaSiCPy...")
flatfield, darkfield = calculate_flatfield(tiles, use_darkfield=self.use_darkfield)
Copilot AI Jan 4, 2026

The calculate_flatfield function has a constant_darkfield parameter that defaults to True, but the GUI doesn't pass this parameter, relying on the default. While this works correctly, it would be clearer to explicitly pass constant_darkfield=True here to make the behavior visible to code readers and avoid confusion if the default ever changes.

Suggested change
flatfield, darkfield = calculate_flatfield(tiles, use_darkfield=self.use_darkfield)
flatfield, darkfield = calculate_flatfield(
tiles,
use_darkfield=self.use_darkfield,
constant_darkfield=True,
)

Collaborator Author

[Claude Code] Fixed in commit 39448e8 - now explicitly passes constant_darkfield=True.

- Fix #1: Add early shape validation for flatfield/darkfield in TileFusion.__init__
- Fix #2: Explicitly pass constant_darkfield=True for clarity

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Contributor

Copilot AI left a comment

Pull request overview

Copilot reviewed 4 out of 5 changed files in this pull request and generated 6 comments.



Comment on lines 1429 to 1438
with tempfile.NamedTemporaryFile(suffix=".png", delete=False) as f:
fig.savefig(f.name, dpi=150, bbox_inches="tight")
plt.close(fig)
# Open with default image viewer
if sys.platform == "darwin":
subprocess.Popen(["open", f.name])
elif sys.platform == "win32":
os.startfile(f.name)
else:
subprocess.Popen(["xdg-open", f.name])
Copilot AI Jan 4, 2026

The temporary file created with delete=False is never explicitly deleted. This could lead to accumulation of temporary PNG files in the system temp directory. Consider either setting delete=True and keeping the file handle open until after the viewer opens it, or implementing cleanup logic to remove old temp files.

Collaborator Author

[Claude Code] Skipped - repeat. OS handles temp cleanup.

Comment on lines +1263 to +1279
def calculate_flatfield(self):
if not self.drop_area.file_path:
return

self.calc_flatfield_button.setEnabled(False)
self.flatfield_status.setText("Calculating flatfield...")
self.flatfield_status.setStyleSheet("color: #ff9500; font-size: 11px;")

self.flatfield_worker = FlatfieldWorker(
self.drop_area.file_path,
n_samples=50,
use_darkfield=self.darkfield_checkbox.isChecked(),
)
self.flatfield_worker.progress.connect(self.log)
self.flatfield_worker.finished.connect(self.on_flatfield_calculated)
self.flatfield_worker.error.connect(self.on_flatfield_error)
self.flatfield_worker.start()
Copilot AI Jan 4, 2026

If a user clicks "Calculate Flatfield" multiple times rapidly, multiple FlatfieldWorker threads could be started simultaneously without proper cleanup of previous threads. Consider checking if self.flatfield_worker is already running before starting a new calculation, or disconnecting signals from the previous worker. This could lead to race conditions or unexpected behavior.

Collaborator Author

[Claude Code] Skipped - repeat. Edge case, worker handles completion.


# Create BaSiC instance and fit
basic = BaSiC(get_darkfield=use_darkfield, smoothness_flatfield=1.0)
basic.fit(channel_stack)

Copilot AI Jan 4, 2026

The call to basic.fit(channel_stack) is not wrapped in error handling. If BaSiCPy fails to fit (e.g., due to invalid data, insufficient samples, or numerical issues), the exception will propagate without a clear error message. Consider adding a try-except block to provide more context about which channel failed and why.

Suggested change
basic.fit(channel_stack)
try:
    basic.fit(channel_stack)
except Exception as exc:
    raise RuntimeError(
        f"BaSiCPy flatfield fitting failed for channel {ch} "
        f"with data shape {channel_stack.shape}"
    ) from exc

Collaborator Author

[Claude Code] Fixed in commit 6ecef5d - BaSiCPy fit() now wrapped in try-except with channel context.

Comment on lines +1347 to +1375
def on_flatfield_dropped(self, file_path):
    import numpy as np

    try:
        from tilefusion import load_flatfield

        self.flatfield, self.darkfield = load_flatfield(Path(file_path))
        n_channels = self.flatfield.shape[0]
        status = f"Loaded ({n_channels} channels)"
        if self.darkfield is not None:
            status += " + darkfield"
        self.flatfield_status.setText(status)
        self.flatfield_status.setStyleSheet(
            "color: #34c759; font-size: 11px; font-weight: 600;"
        )
        self.view_flatfield_button.setEnabled(True)
        self.clear_flatfield_button.setEnabled(True)
        self.save_flatfield_button.setEnabled(True)
        # Enable flatfield correction when successfully loaded
        self.flatfield_checkbox.setChecked(True)
        self.log(f"Loaded flatfield from {file_path}: {self.flatfield.shape}")
    except Exception as e:
        # Clear any stale flatfield data on load failure
        self.flatfield = None
        self.darkfield = None
        self.flatfield_status.setText(f"Load failed: {e}")
        self.flatfield_status.setStyleSheet("color: #ff3b30; font-size: 11px;")
        self.view_flatfield_button.setEnabled(False)
        self.log(f"Error loading flatfield: {e}")

Copilot AI Jan 4, 2026

When loading a flatfield file, there's no validation that it matches the dimensions of the currently selected input file. Users could load an incompatible flatfield and only discover the mismatch when they run stitching/preview. Consider adding validation in on_flatfield_dropped to check if a file is currently loaded and whether the flatfield dimensions match the expected tile shape.
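A minimal sketch of what such a check could look like (hypothetical helper; expected_tile_shape and n_channels would have to be read from the currently loaded OME-TIFF, which the drop handler does not currently do):

def _flatfield_matches_input(flatfield, expected_tile_shape, n_channels):
    # Hypothetical check: compare a loaded (C, Y, X) flatfield against the
    # tile geometry of the currently loaded input file
    return flatfield.shape == (n_channels,) + tuple(expected_tile_shape)

on_flatfield_dropped could then warn, or refuse to load, when this returns False instead of deferring the mismatch to stitch time.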

Collaborator Author

[Claude Code] Skipped - repeat. Shape validation now in TileFusion.__init__ provides early error.


if not tiles:
    raise ValueError("tiles list is empty")


Copilot AI Jan 4, 2026

The function converts 2D arrays to 3D but doesn't validate that tiles have the expected number of dimensions. If a tile has more than 3 dimensions (e.g., 4D array), it will pass through without error but may cause unexpected behavior. Consider adding validation to ensure tiles are either 2D or 3D arrays after the conversion.

Suggested change
# Validate tile dimensionality: only 2D (Y, X) or 3D (C, Y, X) arrays are supported
for i, t in enumerate(tiles):
    if t.ndim not in (2, 3):
        raise ValueError(
            f"Tile {i} has {t.ndim} dimensions; expected 2 (Y, X) or 3 (C, Y, X)"
        )

Collaborator Author

[Claude Code] Fixed in commit 6ecef5d - added validation for 2D/3D tile dimensionality.

gui/app.py Outdated
if sys.platform == "darwin":
    subprocess.Popen(["open", f.name])
elif sys.platform == "win32":
    os.startfile(f.name)

Copilot AI Jan 4, 2026

The function os.startfile is only available on Windows. On non-Windows platforms, this will raise an AttributeError at runtime even though the code is in a Windows-specific conditional block. The issue is that the code path is determined at runtime, but the function might be referenced during bytecode compilation or linting. Consider using subprocess.Popen with platform-specific commands instead, or wrapping the call in a try-except block.

Suggested change
os.startfile(f.name)
subprocess.Popen(["cmd", "/c", "start", "", f.name], shell=True)

Collaborator Author

[Claude Code] Fixed in commit 6ecef5d - now uses subprocess on Windows instead of os.startfile.

- Fix #3: Wrap BaSiCPy fit() in try-except with channel context
- Fix #5: Validate tile dimensionality (2D or 3D only)
- Fix #6: Use subprocess instead of os.startfile for Windows compatibility

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Contributor

Copilot AI left a comment

Pull request overview

Copilot reviewed 4 out of 5 changed files in this pull request and generated 7 comments.



If the file format is invalid (not a dictionary with 'flatfield' key).
"""
try:
    loaded = np.load(path, allow_pickle=True)

Copilot AI Jan 5, 2026

Using allow_pickle=True when loading data can be a security risk if the files come from untrusted sources, as it can execute arbitrary code. Consider using a more secure serialization format like npz without pickling, or at minimum document this security consideration in the docstring and validate the source of the files.
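For comparison, a pickle-free variant is possible with np.savez (a sketch only; this is not the format the PR writes, and the function names here are hypothetical):

import numpy as np

def save_flatfield_npz(path, flatfield, darkfield=None):
    # Arrays are stored as separate entries, so no pickled Python objects are needed
    arrays = {"flatfield": flatfield.astype(np.float32)}
    if darkfield is not None:
        arrays["darkfield"] = darkfield.astype(np.float32)
    np.savez(path, **arrays)

def load_flatfield_npz(path):
    with np.load(path) as data:  # allow_pickle defaults to False
        flatfield = data["flatfield"]
        darkfield = data["darkfield"] if "darkfield" in data else None
    return flatfield, darkfield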

Collaborator Author

[Claude Code] Skipped - repeat. Local file loading from user-selected files is not a security risk.

Comment on lines +215 to +217
if region.ndim == 2:
    ff_region = flatfield[0, y_slice, x_slice]
    df_region = darkfield[0, y_slice, x_slice] if darkfield is not None else None

Copilot AI Jan 5, 2026

When region is 2D, the code assumes flatfield has at least one channel and uses flatfield[0, ...]. However, there's no validation that flatfield.ndim == 3 or that it has at least one channel. If flatfield is malformed (e.g., 2D array), this will raise an IndexError. Consider adding validation for flatfield shape at the beginning of the function.
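The guard being suggested would amount to a few lines at the top of apply_flatfield_region, for example (a sketch, not part of the merged code):

if flatfield.ndim != 3 or flatfield.shape[0] < 1:
    raise ValueError(
        f"flatfield must have shape (C, Y, X) with C >= 1, got {flatfield.shape}"
    )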

Collaborator Author

[Claude Code] Skipped - flatfield always has ≥1 channel by construction from calculate_flatfield(). Empty flatfield is invalid data.

Comment on lines 1 to 311
"""
Flatfield correction module using BaSiCPy.

Provides functions to calculate and apply flatfield (and optionally darkfield)
correction for microscopy images.
"""

from pathlib import Path
from typing import List, Optional, Tuple

import numpy as np

try:
from basicpy import BaSiC

HAS_BASICPY = True
except ImportError:
HAS_BASICPY = False


def calculate_flatfield(
tiles: List[np.ndarray],
use_darkfield: bool = False,
constant_darkfield: bool = True,
) -> Tuple[np.ndarray, Optional[np.ndarray]]:
"""
Calculate flatfield (and optionally darkfield) using BaSiCPy.

Parameters
----------
tiles : list of ndarray
List of tile images, each with shape (C, Y, X) or (Y, X) for single-channel.
2D arrays are automatically converted to 3D with shape (1, Y, X).
use_darkfield : bool
Whether to also compute darkfield correction.
constant_darkfield : bool
If True, darkfield is reduced to a single constant value (median) per
channel. This is physically appropriate since dark current is typically
uniform across the sensor. Default is True.

Returns
-------
flatfield : ndarray
Flatfield correction array with shape (C, Y, X), float32.
darkfield : ndarray or None
Darkfield correction array with shape (C, Y, X), or None if not computed.
If constant_darkfield=True, each channel slice will be a constant value.

Raises
------
ImportError
If basicpy is not installed.
ValueError
If tiles list is empty or tiles have inconsistent shapes.
"""
if not HAS_BASICPY:
raise ImportError(
"basicpy is required for flatfield calculation. Install with: pip install basicpy"
)

if not tiles:
raise ValueError("tiles list is empty")

# Validate tile dimensionality: only 2D (Y, X) or 3D (C, Y, X) supported
for i, t in enumerate(tiles):
if t.ndim not in (2, 3):
raise ValueError(
f"Tile {i} has {t.ndim} dimensions; expected 2 (Y, X) or 3 (C, Y, X)"
)

# Support 2D (Y, X) arrays by converting to 3D (1, Y, X)
tiles = [t[np.newaxis, ...] if t.ndim == 2 else t for t in tiles]

# Get shape from first tile
n_channels = tiles[0].shape[0]
tile_shape = tiles[0].shape[1:] # (Y, X)

# Validate all tiles have same shape
for i, tile in enumerate(tiles):
if tile.shape[0] != n_channels:
raise ValueError(f"Tile {i} has {tile.shape[0]} channels, expected {n_channels}")
if tile.shape[1:] != tile_shape:
raise ValueError(f"Tile {i} has shape {tile.shape[1:]}, expected {tile_shape}")

# Calculate flatfield per channel
flatfield = np.zeros((n_channels,) + tile_shape, dtype=np.float32)
darkfield = np.zeros((n_channels,) + tile_shape, dtype=np.float32) if use_darkfield else None

for ch in range(n_channels):
# Stack channel data from all tiles: shape (n_tiles, Y, X)
channel_stack = np.stack([tile[ch] for tile in tiles], axis=0)

# Create BaSiC instance and fit
basic = BaSiC(get_darkfield=use_darkfield, smoothness_flatfield=1.0)
try:
basic.fit(channel_stack)
except Exception as exc:
raise RuntimeError(
f"BaSiCPy flatfield fitting failed for channel {ch} "
f"with data shape {channel_stack.shape}"
) from exc

flatfield[ch] = basic.flatfield.astype(np.float32)

if use_darkfield:
if constant_darkfield:
# Use median value for constant darkfield (more robust than mean)
df_value = np.median(basic.darkfield)
darkfield[ch] = np.full(tile_shape, df_value, dtype=np.float32)
else:
darkfield[ch] = basic.darkfield.astype(np.float32)

return flatfield, darkfield


def apply_flatfield(
tile: np.ndarray,
flatfield: np.ndarray,
darkfield: Optional[np.ndarray] = None,
) -> np.ndarray:
"""
Apply flatfield correction to a tile.

Formula:
If darkfield is provided: corrected = (raw - darkfield) / flatfield
Otherwise: corrected = raw / flatfield

Parameters
----------
tile : ndarray
Input tile with shape (C, Y, X).
flatfield : ndarray
Flatfield correction array with shape (C, Y, X).
darkfield : ndarray, optional
Darkfield correction array with shape (C, Y, X).

Returns
-------
corrected : ndarray
Corrected tile with shape (C, Y, X), cast back to the input dtype.
For integer dtypes, values are clipped to the valid range before
casting (e.g., negative values clipped to 0 for unsigned types).

Raises
------
ValueError
If tile and flatfield shapes are incompatible.
"""
# Validate shapes
if tile.shape != flatfield.shape:
raise ValueError(
f"Tile shape {tile.shape} does not match flatfield shape {flatfield.shape}"
)
if darkfield is not None and tile.shape != darkfield.shape:
raise ValueError(
f"Tile shape {tile.shape} does not match darkfield shape {darkfield.shape}"
)

# Convert to float32 to avoid underflow with unsigned integer types
tile_f = tile.astype(np.float32)
# For flatfield values <= 1e-6, use 1.0 to avoid division by zero/near-zero
flatfield_safe = np.where(flatfield > 1e-6, flatfield, 1.0).astype(np.float32)

if darkfield is not None:
corrected = (tile_f - darkfield.astype(np.float32)) / flatfield_safe
else:
corrected = tile_f / flatfield_safe

# Clip to valid range for integer dtypes to avoid wraparound
if np.issubdtype(tile.dtype, np.integer):
info = np.iinfo(tile.dtype)
corrected = np.clip(corrected, info.min, info.max)

return corrected.astype(tile.dtype)


def apply_flatfield_region(
region: np.ndarray,
flatfield: np.ndarray,
darkfield: Optional[np.ndarray],
y_slice: slice,
x_slice: slice,
) -> np.ndarray:
"""
Apply flatfield correction to a tile region.

Parameters
----------
region : ndarray
Input region with shape (C, h, w) or (h, w).
flatfield : ndarray
Full flatfield correction array with shape (C, Y, X).
darkfield : ndarray, optional
Full darkfield correction array with shape (C, Y, X).
y_slice, x_slice : slice
Slices defining the region within the full tile.

Returns
-------
corrected : ndarray
Corrected region with same shape as input.

Raises
------
ValueError
If region and flatfield shapes are incompatible.
"""
# Validate channel count for 3D regions
if region.ndim == 3 and region.shape[0] != flatfield.shape[0]:
raise ValueError(
f"Region has {region.shape[0]} channels but flatfield has {flatfield.shape[0]} channels"
)

# Extract corresponding flatfield/darkfield regions
if region.ndim == 2:
ff_region = flatfield[0, y_slice, x_slice]
df_region = darkfield[0, y_slice, x_slice] if darkfield is not None else None
else:
ff_region = flatfield[:, y_slice, x_slice]
df_region = darkfield[:, y_slice, x_slice] if darkfield is not None else None

# Convert to float32 to avoid underflow with unsigned integer types
region_f = region.astype(np.float32)
# For flatfield values <= 1e-6, use 1.0 to avoid division by zero/near-zero
ff_safe = np.where(ff_region > 1e-6, ff_region, 1.0).astype(np.float32)

if df_region is not None:
corrected = (region_f - df_region.astype(np.float32)) / ff_safe
else:
corrected = region_f / ff_safe

# Clip to valid range for integer dtypes to avoid wraparound
if np.issubdtype(region.dtype, np.integer):
info = np.iinfo(region.dtype)
corrected = np.clip(corrected, info.min, info.max)

return corrected.astype(region.dtype)


def save_flatfield(
path: Path,
flatfield: np.ndarray,
darkfield: Optional[np.ndarray] = None,
) -> None:
"""
Save flatfield (and optionally darkfield) to a .npy file.

Parameters
----------
path : Path
Output path (should end with .npy).
flatfield : ndarray
Flatfield array with shape (C, Y, X).
darkfield : ndarray, optional
Darkfield array with shape (C, Y, X).
"""
data = {
"flatfield": flatfield.astype(np.float32),
"darkfield": darkfield.astype(np.float32) if darkfield is not None else None,
"channels": flatfield.shape[0],
"shape": flatfield.shape[1:],
}
np.save(path, data, allow_pickle=True)


def load_flatfield(path: Path) -> Tuple[np.ndarray, Optional[np.ndarray]]:
"""
Load flatfield (and optionally darkfield) from a .npy file.

Parameters
----------
path : Path
Path to .npy file.

Returns
-------
flatfield : ndarray
Flatfield array with shape (C, Y, X).
darkfield : ndarray or None
Darkfield array with shape (C, Y, X), or None if not present.

Raises
------
OSError
If the file cannot be read (not found, permission denied, etc.).
ValueError
If the file format is invalid (not a dictionary with 'flatfield' key).
"""
try:
loaded = np.load(path, allow_pickle=True)
except OSError as exc:
raise OSError(f"Cannot read flatfield file '{path}': {exc}") from exc

try:
data = loaded.item()
except (AttributeError, ValueError) as exc:
raise ValueError(
f"Invalid flatfield file format at '{path}'. "
"Expected a NumPy .npy file containing a dictionary as saved by "
"`save_flatfield` (with keys like 'flatfield' and 'darkfield')."
) from exc

if not isinstance(data, dict) or "flatfield" not in data:
raise ValueError(
f"Invalid flatfield file format at '{path}'. "
"Expected a dictionary with at least a 'flatfield' entry."
)

flatfield = data["flatfield"]
darkfield = data.get("darkfield", None)
return flatfield, darkfield
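Taken together, typical use of this module looks roughly like the following (a sketch: the from tilefusion import path mirrors the load_flatfield import in the GUI snippet above, the random arrays stand in for ~50 sampled tiles, and basicpy must be installed):

from pathlib import Path

import numpy as np

from tilefusion import apply_flatfield, calculate_flatfield, load_flatfield, save_flatfield

# Stand-in for ~50 sampled (C, Y, X) tiles from the input OME-TIFF
tiles = [
    np.random.randint(100, 4000, size=(2, 256, 256), dtype=np.uint16)
    for _ in range(50)
]

flatfield, darkfield = calculate_flatfield(tiles, use_darkfield=True)

# Auto-save convention from this PR: {input}_flatfield.npy next to the input file
out = Path("scan_flatfield.npy")
save_flatfield(out, flatfield, darkfield)

# Later (e.g. when the input file is dropped again): load and correct tiles before fusion
flatfield, darkfield = load_flatfield(out)
corrected = apply_flatfield(tiles[0], flatfield, darkfield)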

Copilot AI Jan 5, 2026

The new flatfield correction functionality lacks test coverage. Given that this repository has comprehensive test coverage for other modules (test_fusion.py, test_registration.py, test_io.py, etc.), tests should be added for the flatfield module to maintain consistency. Consider adding tests for calculate_flatfield, apply_flatfield, apply_flatfield_region, save_flatfield, and load_flatfield functions.
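For illustration, a minimal pytest sketch of the kind of coverage being suggested (the file name tests/test_flatfield.py and the from tilefusion import path are assumptions based on imports shown elsewhere in this PR; calculate_flatfield is left out because it requires basicpy):

import numpy as np
import pytest

from tilefusion import apply_flatfield, load_flatfield, save_flatfield


def test_apply_flatfield_divides_by_flatfield():
    tile = np.full((1, 4, 4), 100, dtype=np.uint16)
    flatfield = np.full((1, 4, 4), 2.0, dtype=np.float32)
    corrected = apply_flatfield(tile, flatfield)
    assert corrected.dtype == np.uint16
    assert np.all(corrected == 50)


def test_apply_flatfield_shape_mismatch_raises():
    tile = np.zeros((1, 4, 4), dtype=np.uint16)
    flatfield = np.ones((1, 8, 8), dtype=np.float32)
    with pytest.raises(ValueError):
        apply_flatfield(tile, flatfield)


def test_save_load_roundtrip(tmp_path):
    flatfield = np.ones((2, 4, 4), dtype=np.float32)
    darkfield = np.full((2, 4, 4), 3.0, dtype=np.float32)
    out = tmp_path / "example_flatfield.npy"
    save_flatfield(out, flatfield, darkfield)
    ff, df = load_flatfield(out)
    np.testing.assert_array_equal(ff, flatfield)
    np.testing.assert_array_equal(df, darkfield)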

Collaborator Author

[Claude Code] Skipped - repeat. Test coverage out of scope for this PR.

"channels": flatfield.shape[0],
"shape": flatfield.shape[1:],
}
np.save(path, data, allow_pickle=True)

Copilot AI Jan 5, 2026

Using allow_pickle=True when saving data can be a security risk if the files are shared or come from untrusted sources. Consider using a more secure serialization format like npz without pickling, or at minimum document this security consideration in the docstring.

Collaborator Author

[Claude Code] Skipped - repeat. Local file saving to user-selected path.

Comment on lines +240 to +244
if darkfield is not None and darkfield.shape != expected_shape:
    raise ValueError(
        f"darkfield.shape {darkfield.shape} does not match expected "
        f"tile shape {expected_shape} (channels, Y, X)"
    )

Copilot AI Jan 5, 2026

Darkfield validation should check for None when flatfield is None. If only darkfield is provided without flatfield, it will pass validation but won't work correctly since apply_flatfield requires flatfield to be present. Consider adding validation to ensure darkfield is only provided when flatfield is also provided.

Collaborator Author

[Claude Code] Skipped - independent validation is valid. Darkfield-only (without flatfield) is technically valid for dark current subtraction only.

channel_stack = np.stack([tile[ch] for tile in tiles], axis=0)

# Create BaSiC instance and fit
basic = BaSiC(get_darkfield=use_darkfield, smoothness_flatfield=1.0)

Copilot AI Jan 5, 2026

The smoothness_flatfield parameter is hardcoded to 1.0. Different imaging conditions may benefit from different smoothness values. Consider making this a configurable parameter with a sensible default, allowing users to tune the smoothness for their specific use case.
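A sketch of how the knob could be surfaced if it ever becomes necessary (a hypothetical helper, not part of the merged code):

from basicpy import BaSiC

def fit_channel(channel_stack, use_darkfield=False, smoothness_flatfield=1.0):
    # Hypothetical wrapper: expose BaSiCPy's flatfield smoothness as a caller-tunable argument
    basic = BaSiC(get_darkfield=use_darkfield, smoothness_flatfield=smoothness_flatfield)
    basic.fit(channel_stack)
    return basic.flatfield, (basic.darkfield if use_darkfield else None)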

Collaborator Author

[Claude Code] Skipped - 1.0 is the appropriate default for BaSiCPy. Users can modify the parameter if needed for specific imaging conditions.

Comment on lines +161 to +162
# For flatfield values <= 1e-6, use 1.0 to avoid division by zero/near-zero
flatfield_safe = np.where(flatfield > 1e-6, flatfield, 1.0).astype(np.float32)

Copilot AI Jan 5, 2026

The threshold value 1e-6 for avoiding division by zero/near-zero may be too small for some use cases. Consider using a relative threshold based on the data type's precision or making this threshold configurable. For example, with float32 precision, values close to 1e-6 could still cause numerical issues.

Suggested change
# For flatfield values <= 1e-6, use 1.0 to avoid division by zero/near-zero
flatfield_safe = np.where(flatfield > 1e-6, flatfield, 1.0).astype(np.float32)
# Convert flatfield to float32 and use a dtype-aware epsilon to avoid division by zero/near-zero
flatfield_f = flatfield.astype(np.float32)
eps = np.finfo(flatfield_f.dtype).eps
flatfield_safe = np.where(np.abs(flatfield_f) > eps, flatfield_f, 1.0)

Collaborator Author

[Claude Code] Skipped - repeat. Threshold already commented; 1e-6 is appropriate for float32 flatfield values that are normalized around 1.0.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
@hongquanli hongquanli merged commit d740ebb into main Jan 5, 2026
5 checks passed