
Is Perl Y2K38 compliant?

r/perl

This post about Y2K38 got me wondering how Perl will handle the date. From what I can tell, localtime() and gmtime() appear to support dates past 2038. What else should we check?

My Date::Parse::Modern module supports parsing dates past 2038. I specifically wrote unit tests for dates WAY in the past and future.
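For anyone who wants to check their own build, here is a minimal sketch (assuming a 64-bit Perl, where time values are 64-bit integers):

use strict;
use warnings;
use Config;

# One second past the 32-bit rollover (2038-01-19T03:14:07Z)
my $after_2038 = 2_147_483_648;
my @t = gmtime($after_2038);
printf "gmtime(2**31) => year %d\n", $t[5] + 1900;   # 2038 on 64-bit builds

# The IV size hints at whether times past 2038 are representable
print "ivsize: $Config{ivsize} bytes\n";             # 8 => 64-bit integers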

submitted by /u/scottchiefbaker

JSON parse array of data

Perl questions on StackOverflow

I have the following program with JSON:

use strict;
use warnings;

use Data::Dumper qw( );
use JSON qw( );

my $json_text = '[
  {
    "sent": "2026-01-16T17:00:00Z",
    "data": [
      {
        "headline": "text1",
        "displayText": "text2"
      },
      {
        "displayText": "text3"
      },
      {
        "displayText": "text4"
      }
    ]
  },
  {
    "sent": "2026-01-16T17:00:00Z",
    "data": [
      {
        "headline": "text5",
        "displayText": "text6"
      },
      {
        "displayText": "text7"
      },
      {
        "displayText": "text8"
      },
      {
        "headline": "text9",
        "displayText": "text10"
      }
    ]
  }
]';

my $json = JSON->new;
my $data = $json->decode($json_text);

print Data::Dumper->Dump($data);

# This is pseudocode:
foreach ( $data->[] ) {
    print "\$_ is $_";
}

I would like to walk through the elements in the JSON and find all sent and all displayText values, but I do not know how to dereference the first element. In this case, the top-level element is an array without any name.
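A minimal sketch of the dereference in question: decode() returns an array reference, so @$data (or @{ $data }) walks its elements, and each element is a hash reference:

for my $entry (@$data) {
    print "sent: $entry->{sent}\n";               # each top-level element is a hash
    for my $item (@{ $entry->{data} }) {          # 'data' holds an array of hashes
        next unless exists $item->{displayText};
        print "displayText: $item->{displayText}\n";
    }
}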

Thunderhorse beta released!

dev.to #perl

Thunderhorse framework was just released as beta. Read more about it on my blog: https://bbrtj.eu/blog/article/thunderhorse-beta-released


How can we make this Moose faster?

blogs.perl.org

(I make no apologies for the ChatGPT images in my recent blog posts, by the way. No artists are missing out on being paid: I wasn’t going to hire an artist to illustrate these blog posts which will be read by like three people.)

A while back, I wrote MooseX::XSAccessor which you can add to Moose classes to inspect your attributes and try to replace the accessors with faster XS-based ones. Now I’ve done the same for constructors (new) and destructors (DESTROY) with MooseX::XSConstructor.

There are probably still bugs, but initial benchmarks look promising:

            Rate Standard       XS
Standard  6392/s       --     -43%
XS       11228/s      76%       --

They’re mostly just a matter of importing the modules and forgetting about them. If the modules detect that your class is too complex for the XS modules to handle, they’ll sit back and let the regular Moose code work its magic. Give them a try. Let me know if they break things. They usually won’t.
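For reference, usage is a one-line addition per module; a minimal sketch (the attribute names here are made up, and both modules are loaded the same way, per the post):

package Point;
use Moose;
use MooseX::XSAccessor;      # tries to swap in XS accessors
use MooseX::XSConstructor;   # tries to swap in an XS constructor

has x => (is => 'rw', isa => 'Num');
has y => (is => 'rw', isa => 'Num');

__PACKAGE__->meta->make_immutable;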


We are happy to announce that Geizhals Preisvergleich supports the German Perl Workshop 2026.

Geizhals Preisvergleich

Geizhals Preisvergleich is one of the largest product and price comparison platforms in the German-speaking world.

Our company has held its own for 25 years now in a highly competitive, dynamic industry.

We attribute our success to the uncompromising implementation of our company philosophy: transparency, objective and user-oriented information, and the quality of our content.

Geizhals Preisvergleich is a company affiliated with the Heise Medien Gruppe.


Rebuild embed.h last, in case other regen steps wanted to add more things to it

Otherwise, it's possible that later steps add new macros that `embed.pl`
would have wanted to add to its list of things to `#undef`, requiring a
second run of `make regen`.

By running embed.pl last, we ensure this happens in the right order the
first time.

Leveraging Gemini CLI and the underlying Gemini LLM to build Model Context Protocol (MCP) AI applications in Perl with Google Cloud Run.


Why not just use Python?

Python has traditionally been the main coding language for ML and AI tools. One of the strengths of the MCP protocol is that the actual implementation details are independent of the development language. The reality is that not every project is coded in Python, and MCP allows you to use the latest AI approaches with other coding languages.

Perl? Is that even a language anymore?

The goal of this article is to provide a minimal, working MCP server in Perl that can be run without any unneeded extra code or extensions.

The Perl MCP module is here:

MCP-0.06

What Is Perl?

Perl is a general-purpose, high-level programming and scripting language, primarily known for its powerful text manipulation capabilities. Originally created by Larry Wall in 1987 for easier report processing, it has evolved to be used for a wide range of tasks, including system administration, web development, and network programming.

The main site for Perl is:

The Perl Programming Language - www.perl.org

Installing Perl

The step-by-step instructions vary by platform; for a basic Debian system, here are the steps:

sudo apt-get install perl cpanminus

xbill@penguin:~/gemini-cli-codeassist/mcp-https-perl$ perl --version

This is perl 5, version 36, subversion 0 (v5.36.0) built for x86_64-linux-gnu-thread-multi
(with 60 registered patches, see perl -V for more detail)

Gemini CLI

If it is not pre-installed, you can install the Gemini CLI to interact with the source files and provide real-time assistance:

npm install -g @google/gemini-cli

Testing the Gemini CLI Environment

Once you have all the tools and the correct Node.js version in place, you can test the startup of Gemini CLI. You will need to authenticate with a key or your Google account:

gemini

Node Version Management

Gemini CLI needs a consistent, up-to-date version of Node.js. The nvm command can be used to get a standard Node environment:

GitHub - nvm-sh/nvm: Node Version Manager - POSIX-compliant bash script to manage multiple active node.js versions

Perl MCP Documentation

The official MCP CPAN page provides samples and documentation for getting started:

MCP

Where do I start?

The strategy for starting MCP development is an incremental, step-by-step approach.

First, the basic development environment is set up with the required system variables and a working Gemini CLI configuration.

Then, a minimal Hello World Style Perl MCP Server is built with HTTP transport. This server is validated with Gemini CLI in the local environment.

This setup validates the connection from Gemini CLI to the local process via MCP. The MCP client (Gemini CLI) and the MCP server both run in the same local environment.

Next, the basic MCP server is extended with Gemini CLI to add several new tools in standard code.

Setup the Basic Environment

At this point you should have a working Perl environment and a working Gemini CLI installation. The next step is to clone the GitHub samples repository with support scripts:

cd ~
git clone https://github.com/xbill9/gemini-cli-codeassist

Then run init.sh from the cloned directory.

The script will attempt to determine your shell environment and set the correct variables:

cd gemini-cli-codeassist
source init.sh

If your session times out or you need to re-authenticate, you can run the set_env.sh script to reset your environment variables:

cd gemini-cli-codeassist
source set_env.sh

Variables like PROJECT_ID need to be set up for use in the various build scripts, so the set_env script can be used to reset the environment if your session times out.

Hello World with HTTP Transport

One of the key features that the standard MCP libraries provide is abstracting various transport methods.

The high-level MCP tool implementation is the same no matter which low-level transport the MCP client uses to connect to an MCP server.

The simplest transport that the SDK supports is the stdio (stdin/stdout) transport, which connects to a locally running process. Both the MCP client and MCP server must be running in the same environment.

The connection over HTTP will look similar to this:

# Use HTTP transport
any '/mcp' => $server->to_action;
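Putting the pieces together, here is a minimal sketch of a complete HTTP server, assembled from the fragments shown in this article (the greet tool mirrors the one validated with Gemini CLI below; the MCP::Server->new call and the return-value handling are assumptions based on those fragments):

use Mojolicious::Lite -signatures;
use MCP::Server;

my $server = MCP::Server->new;

# A single tool that echoes its argument back
$server->tool(
    name         => 'greet',
    description  => 'Echo a greeting back to the caller.',
    input_schema => {
        type       => 'object',
        properties => { param => { type => 'string' } },
    },
    code => sub ($tool, $args) { return $args->{param} },
);

# Route all MCP traffic over HTTP
any '/mcp' => $server->to_action;

app->start;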

Perl Package Information

The Perl code depends on several standard libraries for MCP and logging:

requires 'Mojolicious::Lite';
requires 'MCP::Server';
requires 'JSON::MaybeXS';
requires 'WWW::Google::Cloud::Auth::ServiceAccount';
requires 'URI::Encode';
requires 'LWP::Protocol::https';

Installing and Running the Perl Code

Run the default make target on the local system to install the dependencies and run the tests:

xbill@penguin:~/gemini-cli-codeassist/mcp-https-perl$ make
Installing dependencies...
--> Working on .
Configuring /home/xbill/gemini-cli-codeassist/mcp-https-perl ... OK
<== Installed dependencies for .. Finishing.
Running tests...
t/00_compile.t .. ok   
All tests successful.
Files=1, Tests=1, 1 wallclock secs ( 0.00 usr 0.00 sys + 0.21 cusr 0.03 csys = 0.24 CPU)
Result: PASS

To test the code:

xbill@penguin:~/gemini-cli-codeassist/mcp-https-perl$ make test
Running tests...
t/00_compile.t .. ok   
All tests successful.
Files=1, Tests=1, 0 wallclock secs ( 0.01 usr 0.01 sys + 0.18 cusr 0.05 csys = 0.25 CPU)
Result: PASS

Gemini CLI settings.json

In this example, the Perl source code uses a Perl interpreter that can be called directly from Gemini CLI.

The default Gemini CLI settings.json has an entry for the source:

{
  "mcpServers": {
    "hello-https-perl": {
      "url": "http://127.0.0.1:8080/mcp"
    }
  }
}

Validation with Gemini CLI

Finally, Gemini CLI is restarted and the MCP connection over HTTP to the Perl code is validated. The full Gemini CLI session will start:

> /mcp list

Configured MCP servers:

🟢 hello-https-perl - Ready (1 tool)
  Tools:
  - greet

> greet Camel!

✦ I will call the greet tool with the parameter "Camel!".

╭────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
│ ? greet (hello-https-perl MCP Server) {"param":"Camel!"} ← │
│ │
│ MCP Server: hello-https-perl │
│ Tool: greet │
│ │
│ Allow execution of MCP tool "greet" from server "hello-https-perl"? │
│ │
│ 1. Allow once │
│ 2. Allow tool for this session │
│ ● 3. Allow all server tools for this session │
│ 4. No, suggest changes (esc) │
│ │
╰────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
✦ I will call the greet tool with the parameter "Camel!".

╭────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
│ ✓ greet (hello-https-perl MCP Server) {"param":"Camel!"} │
│ │
│ Camel! │
╰────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯

Deploying to Cloud Run

After the HTTP version of the MCP server has been tested locally, it can be deployed remotely to Google Cloud Run.

First, switch to the directory with the HTTP MCP sample code:

cd ~/gemini-cli-codeassist/mcp-https-perl

Deploy the project to Google Cloud Run with the pre-built cloudbuild.yaml and Dockerfile:

cd ~/gemini-cli-codeassist/mcp-https-perl
xbill@penguin:~/gemini-cli-codeassist/mcp-https-perl$ make deploy

The Cloud Build will start:

> make deploy

✦ I will read the Makefile to understand the deploy target and any other relevant files like cloudbuild.yaml or Dockerfile.
╭────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
│ ✓ ReadFile Makefile │
│ │
│ │
│ ✓ ReadFile cloudbuild.yaml │
│ │
│ │
│ ✓ ReadFile Dockerfile │
│ │
╰────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
✦ I will execute make deploy to submit the build to Google Cloud Build and deploy it to Cloud Run.
╭────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
│ ⊶ Shell make deploy [current working directory /home/xbill/gemini-cli-codeassist/mcp-https-perl] (Submitti… (ctrl+f to focus) │
│ │
│ Step #0: d4d869c8f88d: Pulling fs layer │
│ Step #0: 8b22c905be5d: Pulling fs layer │
│ Step #0: 8b22c905be5d: Waiting │
│ Step #0: 7703c2863431: Verifying Checksum │
│ Step #0: 7703c2863431: Download complete │
│ Step #0: 8b22c905be5d: Verifying Checksum │
│ Step #0: 8b22c905be5d: Download complete │
│ Step #0: 7bbf972c6c2f: Verifying Checksum │
│ Step #0: 7bbf972c6c2f: Download complete │
│ Step #0: d4d869c8f88d: Verifying Checksum │
│ Step #0: d4d869c8f88d: Download complete │
│ │
╰────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯

It can take 15–30 minutes to complete the build.

The cloud build needs to pull in all Perl libraries in the build environment and generate the entire package from scratch.

When the build is complete, an endpoint will be returned:

│ Starting Step #2 │
│ Step #2: Already have image (with digest): gcr.io/cloud-builders/gcloud │
│ Step #2: Deploying container to Cloud Run service [mcp-https-perl] in project [comglitn] region [us-central1] │
│ Step #2: Deploying... │
│ Step #2: Setting IAM Policy............done │
│ Step #2: Creating │
│ Revision...................................................................................................................... │
│ ..............................................................................................done │
│ Step #2: Routing traffic.....done │
│ Step #2: Done. │
│ Step #2: Service [mcp-https-perl] revision [mcp-https-perl-00002-pw8] has been deployed and is serving 100 percent of traffic. │
│ Step #2: Service URL: https://mcp-https-perl-1056842563084.us-central1.run.app │
│ Finished Step #2 │
│ PUSH │

The service endpoint in this example is:

https://mcp-https-perl-1056842563084.us-central1.run.app

The actual endpoint will vary based on your project settings.

Review Service in Cloud Run

Navigate to the Google Cloud console and search for Cloud Run; there you can view detailed information on the Cloud Run service.

Cloud Logging

The remote server writes logs to stderr in standard JSON format. These logs are available from the deployed Cloud Run service.

Validate HTTP connection

Once you have the endpoint, you can attempt a connection; navigate to it in your browser:

https://mcp-https-perl-1056842563084.us-central1.run.app

You will need to adjust the exact URL to match the URL returned from Cloud Build.

You will get an error, because this connection expects a message in the MCP format:

Page Not Found

Gemini CLI settings.json.cloudrun

Replace the default Gemini CLI configuration file settings.json with the pre-configured sample settings.json.cloudrun to use the Cloud Run version of the connection:

{
  "mcpServers": {
    "hello-cloudrun-perl": {
      "httpUrl": "https://mcp-https-perl-$PROJECT_NUMBER.us-central1.run.app/mcp"
    }
  }
}

Copy the Cloud Run version of the Gemini CLI configuration file:

xbill@penguin:~/gemini-cli-codeassist/mcp-https-perl$ cd .gemini
cp settings.json.cloudrun settings.json
xbill@penguin:~/gemini-cli-codeassist/mcp-https-perl/.gemini$

Validation with Gemini CLI

The final connection test uses Gemini CLI as an MCP client, with the deployed Cloud Run service in Perl providing the MCP server. Start up Gemini CLI with the updated settings:

gemini

> /mcp list

Configured MCP servers:

🟢 hello-cloudrun-perl - Ready (1 tool)
  Tools:
  - greet

> greet Swifties!

✦ I will call the greet tool to get a greeting for "Swifties!".

╭────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
│ ? greet (hello-cloudrun-perl MCP Server) {"param":"Swifties!"} ← │
│ │
│ MCP Server: hello-cloudrun-perl │
│ Tool: greet │
│ │
│ Allow execution of MCP tool "greet" from server "hello-cloudrun-perl"? │
│ │
│ 1. Allow once │
│ 2. Allow tool for this session │
│ ● 3. Allow all server tools for this session │
│ 4. No, suggest changes (esc) │
│ │
╰────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
╭────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
│ ✓ greet (hello-cloudrun-perl MCP Server) {"param":"Swifties!"} │
│ │
│ Swifties! │
╰────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
✦ OK. I've greeted the Swifties

Summary

MCP development with Perl using Gemini CLI was validated with an incremental, step-by-step approach.

A minimal streaming HTTP transport MCP server was built from source code and validated with Gemini CLI running as an MCP client in the same local environment.

Then the MCP server was wrapped in a container and submitted to Google Cloud Build for deployment to Google Cloud Run. The remote MCP server was validated with a standard browser and with Gemini CLI.

Finally, remote MCP operations were performed from the local Gemini CLI installation to the Perl MCP server hosted in Google Cloud Run.

This approach can be extended to more complex deployments and Cloud based options.

This week in PSC (211) | 2026-01-12

blogs.perl.org

Just Paul and Aristotle this week.

We mostly discussed the experimental refaliasing and declared_refs features to see if we can find a path towards declaring at least the latter non-experimental. This would be useful in its own right, as well as an enabler for PPC0034 “Ref-aliased parameters in subroutine signatures”.
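For context, a minimal sketch of the two features under discussion (both still experimental, so their warnings are disabled explicitly here):

use v5.26;
use feature qw(refaliasing declared_refs);
no warnings qw(experimental::refaliasing experimental::declared_refs);

my @array = (1, 2, 3);
\my @alias = \@array;    # declare @alias as an alias for @array
$alias[0] = 42;
say $array[0];           # 42: both names refer to the same array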

[P5P posting of this summary]

regexec.c: Change return 0 to NULL

Perl commits on GitHub
regexec.c: Change return 0 to NULL

It is better practice to return a pointer when that's what the function
is declared to return

Document (and export) [IU]_(BITS|DIG)

Perl commits on GitHub
Document (and export) [IU]_(BITS|DIG)

Fixes #24083. Fixes #24084.

In d957e95daa0143d60933d96d6cbfb69eee6d6269 I changed the definitions of
IV_DIG and UV_DIG to depend on IV_BITS and UV_BITS respectively,
creating the latter in perl.h.  These had only visibility to the perl
core.  But I forgot that the _DIG forms were visible everywhere, so the
_BITS forms needed to be as well.

This commit merely documents all of them as public API (which should
have been the case all along anyway), which automatically causes their
visibility to be made everywhere.

Module::Load vs Module::Runtime for modern Perls

r/perl

I have long used Module::Runtime::use_module rather than Module::Load::load because the former has workarounds for various bugs in now-ancient versions of Perl which I needed to support.

I now no longer target Perl versions older than ~5.26, and would prefer switching to Module::Load, as it is part of core Perl.

Does Module::Runtime provide any benefit over Module::Load for recent versions of Perl?

(Module::Runtime::use_module rather handily returns the package name, but I can manage without that behavior.)
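For illustration, a minimal sketch of that difference (Some::Class is a hypothetical module name):

use Module::Runtime qw(use_module);
my $obj = use_module('Some::Class')->new;   # loads, then returns the package name

use Module::Load;
load 'Some::Class';                         # loads, but returns nothing useful
my $obj2 = Some::Class->new;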

submitted by /u/djerius

scan_num: Macroize common code

Perl commits on GitHub
scan_num: Macroize common code

This avoids repeating code snippets.  It also changes things so adjacent
underscores are all absorbed at once (and warned about).  That means we
no longer have to keep track of if the previous character was an
underscore, so the variable that did that is removed.

Only two checks need be done for running off either end of the buffer.
The buffer is NUL-terminated, so if we see an underscore in the current
position, the next position exists (there is a NUL there if nothing
else); and the macro that looks behind one position is called in only
one place where we haven't always parsed beyond the first character.
scan_num: Replace code by equivalent function call

The previous commit has made this function, long in numeric.c,
available to the rest of core.  The code removed here duplicated what it
does.  Two variables are now unused, and are removed.

CPAN Report 2026

r/perl

Dave writes:

During December, I fixed assorted bugs, and started work on another tranche of ExtUtils::ParseXS fixups, this time focussing on:

  • adding and rewording warning and error messages, and adding new tests for them;

  • improving test coverage: all XS keywords have tests now;

  • reorganising the test infrastructure: deleting obsolete test files, renaming the t/*.t files to a more consistent format; splitting a large test file; modernising tests;

  • refactoring and improving the length(str) pseudo-parameter implementation.

By the end of this report period, that work was about half finished; it is currently finished and being reviewed.

Summary:

  • 10:25 GH #16197 re eval stack unwinding
  • 1:39 GH #23903 BBC: bleadperl breaks ETHER/Package-Stash-XS-0.30.tar.gz
  • 0:09 GH #23986 Perl_rpp_popfree_to(SV sp**) questionable design
  • 3:02 fix Pod::Html stderr noise
  • 27:47 improve ExtUtils::ParseXS
  • 1:47 modernise perlxs.pod

Total: 44:49 (HH:MM)


Tony writes:

[Hours] [Activity]

2025/12/01 Monday
0.23 memEQ cast discussion with khw
0.42 #23965 testing, review and comment
2.03 #23885 review, testing, comments
0.08 #23970 review and approve
0.13 #23971 review and approve
0.08 #23965 follow-up
Total: 2.97

2025/12/02 Tuesday
0.73 #23969 research and comment
0.30 #23974 review and approve
0.87 #23975 review and comment
0.38 #23975 review reply and approve
0.25 #23976 review, research and approve
0.43 #23977 review, research and approve
1.20 #23918 try to produce expected bug and succeed
Total: 4.16

2025/12/03 Wednesday
0.35 #23883 check updates and approve with comment
0.72 #23979 review, try to trigger the messages and approve
0.33 #23968 review, research and approve
0.25 #23961 review and comment
2.42 #23918 fix handling of context, testing, push to update, comment on overload handling plans, start on it
Total: 4.07

2025/12/04 Thursday
2.05 #23980 review, comment and approve, fix group_end() decorator and make PR 23983
0.25 #23982 review, research and approve
1.30 #23918 test for skipping numeric overload, and fix, start on force overload
Total: 3.60

2025/12/05 Friday
0.63 #23980 comment
Total: 0.63

2025/12/08 Monday
0.90 #23984 review and comment
0.13 #23988 review and comment
2.03 #23918 work on force overload implementation
1.45 #23918 testing, docs
Total: 4.51

2025/12/09 Tuesday
0.32 github notifications
1.23 #23918 add more tests
0.30 #23992 review
0.47 #23993 research, testing and comment
0.58 #23993 review and comment
Total: 2.90

2025/12/10 Wednesday
0.72 #23992 review updates, testing and comment
1.22 #23782 review (and some #23885 discussion in irc)
1.35 look into Jim's freebsd core dump, reproduce and find cause, email him and briefly comment in irc, more 23885 discussion and approve 23885
Total: 3.29

2025/12/11 Thursday
0.33 #23997 comment
1.08 #23995 research and comment
0.47 #23998 review and approve
1.15 #23918 cleanup
Total: 3.03

2025/11/15 Saturday
0.20 #23998 review updates and approve
0.53 #23975 review comment, research and follow-up
1.25 #24002 review discussion, debugging and comment
0.28 #23993 comment
0.67 #23918 commit cleanup
0.20 #24002 follow-up
0.65 #23975 research and follow-up
Total: 3.78

2025/12/16 Tuesday
0.40 #23997 review, comment, approve
0.37 #23988 review and comment
0.95 #24001 debugging and comment
0.27 #24006 review and comment
0.23 #24004 review and nothing to say
1.27 #23918 more cleanup, documentation
Total: 3.49

2025/12/17 Wednesday
0.32 #24008 testing, debugging and comment
0.08 #24006 review update and approve
0.60 #23795 quick re-check and approve
1.02 #23918 more fixes, address each PR comment and push for CI
0.75 #23956 work on a test and a fix, push for CI
0.93 #24001 write a test, and a fix, testing
0.67 #24001 write an inverted test too, commit message and push for CI
0.17 #23956 perldelta
0.08 #23956 check CI results, make PR 24010
0.15 #24001 perldelta and make PR 24011
Total: 4.77

2025/12/18 Thursday
0.27 #24001 rebase, local testing, push for CI
1.15 #24012 research
0.50 #23995 testing and comment
0.08 #24001 check CI results and apply to blead
Total: 2.00

Which I calculate is 43.2 hours.

Approximately 32 tickets were reviewed or worked on, and 1 patch was applied.


Paul writes:

A mix of focus this month. I was hoping to get attributes-v2 towards something that could be reviewed and merged, but then I bumped into a bunch of refalias-related issues. Also spent about 5 hours reviewing Dave's giant xspod rewrite.

  • 1 = Rename THING token in grammar to something more meaningful
    • https://github.com/Perl/perl5/pull/23982
  • 4 = Continue work on attributes-v2
  • 1 = BBC Ticket on Feature-Compat-Class
    • https://github.com/Perl/perl5/issues/23991
  • 2 = Experiment with refalias parameters with defaults in XS-Parse-Sublike
  • 1 = Managing the PPC documents and overall process
  • 2 = Investigations into the refalias and declared_refs features, to see if we can un-experiment them
  • 2 = Add a warning to refalias that breaks closures
    • https://github.com/Perl/perl5/pull/24026 (work-in-progress)
  • 3 = Restore refaliased variables after foreach loop
    • https://github.com/Perl/perl5/issues/24028
    • https://github.com/Perl/perl5/pull/24029
  • 3 = Clear pad after multivariable foreach
    • https://github.com/Perl/perl5/pull/24034 (not yet merged)
  • 6 = Github code reviews (mostly on Dave's xspod)
    • https://github.com/Perl/perl5/pull/23795

Total: 25 hours

Understanding TPRF's Finance, 2026 Edition

blogs.perl.org

An Analysis of The Perl and Raku Foundation's 2024 Finances

In October 2024, I published an article analyzing the financial situation of The Perl and Raku Foundation (TPRF). Since then, I have left the board, and my life is now largely unrelated to Perl. I no longer have insight into TPRF's internal decision-making, but I received a few suggestions to continue, so this article again analyzes TPRF's finances using publicly available data for the 2024 calendar year. There is an unavoidable delay between when nonprofit tax returns are filed and when they become public.

Executive Summary

  • Assets at end of 2023: $200,215
  • Revenue in 2024: $86,845
  • Expenses in 2024: $188,037
  • Assets at end of 2024: $101,525
Despite a strong increase in donations, TPRF spent more than twice its revenue in 2024, resulting in a $98,690 loss and a halving of its assets.

Revenue: A Positive Turn

[Image: 2024 revenue chart]
Total revenue in 2024 was $86,845, more than double the $41,442 recorded in 2023.
The increase came almost entirely from contributions (donations), which rose from $24,395 in 2023 to $69,911 in 2024. This suggests that fundraising efforts were effective. Of that amount, I sourced $10,000.

Program services revenue, which is primarily revenue from conference ticket sales, remained stable. It was $16,581 in 2024, compared to $15,903 in 2023.

[Image: 2024 revenue table]

Expenses: Still high

[Image: 2024 expenses chart]

Total expenses in 2024 were $188,037, roughly the same as in 2023 ($181,655) and the highest level of spending since 2019.

The two largest categories continue to be Grants ($89,944, 47.8%) and Conferences ($62,365, 33.2%). Together, these two categories accounted for over 80% of total spending.

The foundation spent nearly $190,000 in a year where it earned only $87,000.

[Image: 2024 expenses table]

Assets: Cut in Half

At the end of 2023, TPRF held $200,215 in assets. By the end of 2024, that figure had dropped to $101,525.

In a single year, the foundation lost $98,690. TPRF would run out of money if this trend continued.

[Images: 2024 expenses vs. revenue chart; 2024 assets chart]

Cost Cutting: What Changed

TPRF does not publish a detailed expense breakdown, which makes external analysis difficult. That said, based on public information, several cost-saving changes appear to have occurred in 2024:

  • The foundation stopped hiring an intern to support diversity in open-source software, saving $8,000 per year. This is unfortunate, but financially meaningful.
  • Previous grant programs for work outside of core language development seem to have ended. I have not seen any calls for such grants since January 2024.
  • I was also told that the 2025 conference was organized at a significantly lower cost than in prior years, which should help going forward.

TPRF sponsors a free "community dinner" at FOSDEM. I voted against continuing this practice, arguing that foundation funds should prioritize code development and infrastructure rather than social activities that benefit only attendees. Despite this, the board continues to host dinner.

Transparency

Beyond high-level numbers, it is virtually impossible to understand how the foundation spends its money.

There was a proposal by @Tib to leverage Open Collective, which would publicly show where funds come from and how they are spent. This would also make earmarked donations automatic and transparent, including clear allocation between Perl and Raku, which was requested at the previous conferences.

Increased transparency is critical for several reasons:

  • Donor trust: When donors can see exactly how their contributions are used, they are more confident that their money is making an impact.
  • Accountability: Publicly visible spending ensures that the foundation's leadership is held responsible for financial decisions and priorities.
  • Long-term sustainability: Transparent reporting makes it easier to plan budgets, attract new donors, and justify continued support.
  • Clear communication between communities: Transparency allows both Perl and Raku communities to understand how resources are allocated, preventing misunderstandings or perceived favoritism.

In my opinion, increasing transparency is not optional; it is essential for the foundation's credibility and the health of the communities it serves.

A Call to the Community

My employer donates to TPRF through a gift-matching program every year and I deeply appreciate that support.

Many tech companies in the 2000s relied heavily on Perl, often at no cost. Supporting its ongoing development is a way of saying thank you. It is never too late.

Taking the Win - Perl in the TIOBE Index

blogs.perl.org

There has been much to say about Perl improving in TIOBE during 2025 and ending in the top 10, which is roughly where things were around 2016.

Many things are working well in the Perl community, and we should expect to see them paying off.

For example:


  • The Perl Steering Committee is now several years in and is working effectively

  • New releases continue their consistent, predictable, and reliable cadence

  • Backward compatibility is still a priority, and breaks happen with care and ample warning.

  • MetaCPAN and CPAN are being well run and attracting regular donations

  • The Perl and Raku Foundation continues in its mission, running events and engaging with the community

  • Perl-centric events continue regularly in America, Europe, and Asia. New events have also sprung up, with both in-person and online attendance options

  • Several grants are adding considerable value to the maintenance of Perl

  • Perl continues to be maintained as an official package in every Linux distro and BSD, and is available for Windows. Perl continues to be very viable on a range of systems outside those, some commercial, others proprietary

  • The Perl language has excellent support in major editors/IDEs such as VS Code (by far the most popular), Vim, Neovim, Emacs, IntelliJ, and others.

  • The various community channels are more helpful and friendly than ever.

  • All major AI platforms can help you with Perl, with excellent results. Never has it been easier to decipher Perl code, to have something patiently explained, or to have detailed reviews of your code

  • Perl::Critic and Perl::Tidy are more relevant than ever in the age of AI. You can define a policy to automatically avoid many of Perl's pitfalls and to ensure your code is formatted in a consistent way to suit your needs

Be skeptical of TIOBE by all means, but let's not grasp defeat from the jaws of victory.

Let's take the Win!

Edit: Mentioned events, which deserve a mention.

Manage Environment Configs

dev.to #perl

TL;DR

An app’s config is everything that is likely to vary between deploys (staging, production, developer environments, etc).
The Twelve-Factor App

Storing the often changing parts of configuration in environment variables is
one of the principles of The Twelve-Factor App.

From this principle follows the need to:

  1. ensure that all the required environment variables are set with
    appropriate values, and

  2. store those environment variables and their values in easily accessible ways suitable both for development and for running in production.

Both are typical DevOps problems. To help solve them, use Env::Assert and Env::Dot, two programs which, while doing two very different things, are designed to work in unison.

Env::Assert

Env::Assert was born from frustration. One too many times:

$ PLAEC='Stockholm'
$ if [[ "$PLACE" == '' ]]; then echo "Normal OK"; fi
Normal OK

... And the program fails with no errors!

Not quite what we want!

Another example, from a real life Docker execution script:

perl -Ilib bin/repos-gh-yaml.pl --verbose         \
    | perl -Ilib bin/repos-yaml-csv.pl --verbose  \
    | az storage blob upload --data @-            \
        --content-type 'text/csv'                 \
        --content-encoding 'UTF-8'                \
        --content-language 'en_US'                \
        --name "$blob_name"                       \
        --container "$CONTAINER_NAME"             \
        --account-name "$AZURE_STORAGE_ACCOUNT"   \
        --sas-token "$AZURE_STORAGE_SAS_TOKEN"

If the environment variables are wrongly set, or not set at all, it won't become evident until after the run has started. It could take hours before the run reaches the point when they are used.

Describe The Environment

Env::Assert, or rather the executable envassert that comes with it, provides an easy way to find out whether the environment variables are what we require them to be.

envassert is a CLI command to assert that your environment variables match your Environment Description.

Envdesc or Environment Description is a way to describe which environment variables are required by your program.

The Environment Description is written in a file. The default file name is .envdesc.

.envdesc actually looks a lot like a .env file, except instead of
defining variables and their content, it defines regular expressions
which control the variables' content. These regexps are Perl's
extended regular expressions (m/<regexp>/msx).

Example .envdesc:

CONTAINER_NAME=^[a-z0-9-]{1,}$
AZURE_STORAGE_ACCOUNT=^[a-z0-9]{1,}$
AZURE_STORAGE_SAS_TOKEN=^[?].*$
GITHUB_TOKEN=^[[:word:]]{1,}$

In normal circumstances, envassert only verifies the variables that you specifically describe. If you want more control over your environment, there is the meta command envassert (opts: exact=1), which makes envassert also assert that the environment doesn't contain any unknown variables.

## envassert (opts: exact=1)
USER=^username$
HOME=^/home/username$
PATH=^/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin$

Running Env::Assert

You can create an airtight environment description to verify environment variables in both test and production. Just run envassert as the first command during container execution or any script run:

envassert --env-description /home/me/.envdesc \
    || ( echo 'Break execution ...' 1>&2 && exit 1 )

If it detects problems, envassert will report errors and exit with an error, e.g.:

$ envassert
Environment Assert: ERRORS:
    variables:
        FIRST_VAR: Variable FIRST_VAR is missing from environment
        FOURTH_VAR: Variable FOURTH_VAR has invalid content

Running Self-Contained

A .envdesc file is really convenient for a bigger app which may have many disconnected parts and execution scripts. But if you have only a single script which nevertheless is dependent on having certain predefined environment variables, you can also include the .envdesc file in the script. An example:

#!/usr/bin/env sh
envassert --stdin <<'EOF' # Ensure the required environment.
NUMERIC_VAR=^[[:digit:]]+$
TIME_VAR=^\d{2}:\d{2}:\d{2}$
EOF
echo "${NUMERIC_VAR}: ${TIME_VAR}"

Using Env::Assert in a Program

Env::Assert is a Perl language module. If your application is a Perl script or package, you can also call Env::Assert directly in the code.

If you know you will always have a .envdesc file in the working directory, call:

use Env::Assert 'assert';

But it would probably be better to specify the Environment Description file. Other parameters are also available. break_at_first_error will make Env::Assert to only report the first error it detects:

use Env::Assert assert => {
    envdesc_file => 'another-envdesc',
    break_at_first_error => 1,
};

Inlining the description file is also possible:

use Env::Assert assert => {
    exact => 1,
    envdesc => <<'EOF'
NUMERIC_VAR=^[[:digit:]]+$
TIME_VAR=^\d{2}:\d{2}:\d{2}$
EOF
};

Env::Dot

Env::Dot is the other piece of the puzzle, the one which will provide the environment repeatably and reliably.

There are plenty of existing dotenv solutions. Env::Dot, however, offers a few unique features. The .env files are treated more like source files, not as ready-to-run shell (Unix standard sh or Bash) files. With meta commands, the user can specify whether the .env file is compatible with the shell or written in the more limited format that Docker uses:

For standard shell:

# envdot (file:type=shell)
VAR="value"

For Docker:

# envdot (file:type=plain)
VAR=My var value

You can chain .env files. When it sees the meta command read:from_parent, Env::Dot will search for another .env file in the parent directories. It will load the first .env file it finds from the current directory upwards to the root. If you have several applications in different subdirectories which share some environment variables but also have some unique ones, you can place the common ones in the parent directory and refer to them:

# envdot (read:from_parent)
DIR_VAR="dir"
COMMON_VAR="dir"

Env::Dot uses the environment variable ENVDOT_FILEPATHS to read dotenv files located somewhere other than the current working directory. You can specify several file paths; just separate them by ":". Env::Dot will load the files in reverse order, starting from the last. This is the same ordering as in the PATH variable: the first path overrules the following ones. That is, the files are read from the last path to the first, and if the same variable is present in more than one file, the one read later replaces the one already read.

If you are using Windows, separate the paths by ";"!

For example, if you have the following directory structure:

project-root
| .env
+ - sub-project
  | .env

and you specify ENVDOT_FILEPATHS=project-root/sub-project/.env:project-root/.env, then the variables in file project-root/.env will get replaced by the more specific variables in project-root/sub-project/.env.

In Windows, this would be ENVDOT_FILEPATHS=project-root\sub-project\.env;project-root\.env

Env::Dot Executable

Use executable envdot to bring the variables into your shell.
The executable is distributed together with Env::Dot package.

envdot supports the following Unix shells: sh and its derivatives (including bash and ksh), csh and its derivative tcsh, and fish.

Normally the variables are created in a way that also exports them to any subsequent programs run in the same shell, i.e. they become environment variables. However, envdot can also create them as plain shell variables, visible only to the current process.

Examples of usage:

eval `envdot --no-export --shell csh`
eval `envdot --dotenv subdir/.env`
ENVDOT_FILEPATHS='../.env:subdir/.env:.env' eval `envdot`

Using Env::Dot in a Program

Env::Dot is a Perl language module. If used in code, having the .env file is not mandatory. By default, Env::Dot will do nothing if there is no .env file. You can also configure Env::Dot to break execution if there is no .env file.

# If your dotenv file is `.env` or there is no `.env` file:
use Env::Dot;

# If you have a dotenv file in a different filepath:
use Env::Dot read => {
    dotenv_file => '/other/path/my_environment.env',
};

# When you absolutely require a `.env` file:
use Env::Dot read => {
    required => 1,
};

Existing environment variables always take precedence over dotenv variables. A dotenv variable (a variable from a file) does not overwrite an existing environment variable. This is by design, because a dotenv file is meant to augment the environment, not to replace it. This means that you can override a variable in the .env file by creating its counterpart in the environment.

An example of how that works in a normal shell:

#!/usr/bin/env sh
unset VAR
echo "VAR='Good value'" >> .env
perl -e 'use Env::Dot; print "VAR:$ENV{VAR}\n";'
# VAR:Good value
VAR='Better value'; export VAR
perl -e 'use Env::Dot; print "VAR:$ENV{VAR}\n";'
# VAR:Better value

If your .env file(s) contain variables which need interpolation, for example to combine their value from other variables or to execute a command to produce their value, you have to use the envdot program. Env::Dot does not do any interpolation; it cannot, because that would involve evaluating the variable in shell context within the calling program.

Env::Assert And Env::Dot

If you are in the habit of using .env files, .envdesc complements them. Commit your .envdesc file into your repository and it will act as a template from which users or developers can create their own .env file, which should not be committed into Git anyway.

Software developed in Perl, using Catalyst, DBIx, JS, and SQL. Version control: SVN.

Leveraging Gemini CLI and the underlying Gemini LLM to build Model Context Protocol (MCP) AI applications in Perl with a local development…

Leveraging Gemini CLI and the underlying Gemini LLM to build Model Context Protocol (MCP) AI applications in Perl with a local development environment.

Why not just use Python?

Python has traditionally been the main coding language for ML and AI tools. One of the strengths of the MCP protocol is that the actual implementation details are independent of the development language. The reality is that not every project is coded in Python, and MCP allows you to use the latest AI approaches with other coding languages.

Perl? Is that even a language anymore?

The goal of this article is to provide a minimal, working MCP stdio server in Perl that can be run locally without any unneeded extra code or extensions.

The Perl MCP module is here:

MCP-0.06

What Is Perl?

Perl is a general-purpose, high-level programming and scripting language, primarily known for its powerful text manipulation capabilities. Originally created by Larry Wall in 1987 for easier report processing, it has evolved to be used for a wide range of tasks, including system administration, web development, and network programming.

The main site for Perl is:

The Perl Programming Language - www.perl.org

Installing Perl

The step-by-step instructions vary by platform; for a basic Debian system, here are the steps:

sudo apt-get install perl cpanminus

xbill@penguin:~/gemini-cli-codeassist/mcp-stdio-perl$ perl --version

This is perl 5, version 36, subversion 0 (v5.36.0) built for x86_64-linux-gnu-thread-multi
(with 60 registered patches, see perl -V for more detail)

Gemini CLI

If it is not pre-installed, you can install the Gemini CLI to interact with the source files and provide real-time assistance:

npm install -g @google/gemini-cli

Testing the Gemini CLI Environment

Once you have all the tools and the correct Node.js version in place, you can test the startup of Gemini CLI. You will need to authenticate with a key or your Google account:

gemini

Node Version Management

Gemini CLI needs a consistent, up-to-date version of Node.js. The nvm command can be used to get a standard Node environment:

GitHub - nvm-sh/nvm: Node Version Manager - POSIX-compliant bash script to manage multiple active node.js versions

Perl MCP Documentation

The official MCP CPAN page provides samples and documentation for getting started:

MCP

Where do I start?

The strategy for starting MCP development is an incremental, step-by-step approach.

First, the basic development environment is set up with the required system variables and a working Gemini CLI configuration.

Then, a minimal Hello World Style Perl MCP Server is built with stdio transport. This server is validated with Gemini CLI in the local environment.

This setup validates the connection from Gemini CLI to the local process via MCP. The MCP client (Gemini CLI) and the MCP server both run in the same local environment.

Next, the basic MCP server is extended with Gemini CLI to add several new tools in standard code.

Setup the Basic Environment

At this point you should have a working Perl environment and a working Gemini CLI installation. The next step is to clone the GitHub samples repository with support scripts:

cd ~
git clone https://github.com/xbill9/gemini-cli-codeassist

Then run init.sh from the cloned directory.

The script will attempt to determine your shell environment and set the correct variables:

cd gemini-cli-codeassist
source init.sh

If your session times out or you need to re-authenticate, you can run the set_env.sh script to reset your environment variables:

cd gemini-cli-codeassist
source set_env.sh

Variables like PROJECT_ID need to be set up for use in the various build scripts, so the set_env script can be used to reset the environment if your session times out.

Hello World with STDIO Transport

One of the key features that the standard MCP libraries provide is abstracting various transport methods.

The high-level MCP tool implementation is the same no matter which low-level transport the MCP client uses to connect to an MCP server.

The simplest transport that the SDK supports is the stdio (stdin/stdout) transport, which connects to a locally running process. Both the MCP client and MCP server must be running in the same environment.

The connection over stdio will look similar to this:

# Explicitly use stdio transport
$server->to_stdio;
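Assembled into a minimal, self-contained sketch (the greet tool matches the one listed by /mcp list below; the MCP::Server->new call and the return-value handling are assumptions based on the fragments in this article):

use Mojo::Base -strict, -signatures;
use MCP::Server;

my $server = MCP::Server->new;

# A single tool that echoes its argument back
$server->tool(
    name         => 'greet',
    description  => 'Echo a greeting back to the caller.',
    input_schema => {
        type       => 'object',
        properties => { param => { type => 'string' } },
    },
    code => sub ($tool, $args) { return $args->{param} },
);

# Serve MCP requests over stdin/stdout
$server->to_stdio;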

Perl Package Information

The code depends on several standard Perl libraries for MCP and logging:

requires 'Mojolicious::Lite';
requires 'MCP::Server';
requires 'JSON::MaybeXS';
requires 'WWW::Google::Cloud::Auth::ServiceAccount';
requires 'URI::Encode';
requires 'LWP::Protocol::https';

on 'develop' => sub {
    requires 'Perl', '5.010';
    requires 'Perl::Critic';
    requires 'Perl::Tidy';
};

Installing and Running the Perl Code

Run the default make target on the local system to install the dependencies and run the tests:

xbill@penguin:~/gemini-cli-codeassist/mcp-stdio-perl$ make
Installing dependencies...
--> Working on .
Configuring /home/xbill/gemini-cli-codeassist/mcp-stdio-perl ... OK
<== Installed dependencies for .. Finishing.
Running tests...
t/00_compile.t .. ok   

To test the code:

xbill@penguin:~/gemini-cli-codeassist/mcp-stdio-perl$ make test
Running tests...
t/00_compile.t .. ok   
All tests successful.
Files=1, Tests=1, 0 wallclock secs ( 0.01 usr 0.00 sys + 0.17 cusr 0.05 csys = 0.23 CPU)
Result: PASS

Gemini CLI settings.json

In this example, the Perl source code uses a Perl interpreter that can be called directly from Gemini CLI.

The default Gemini CLI settings.json has an entry for the source:

{
  "mcpServers": {
    "hello-stdio-perl": {
      "command": "perl",
      "args": [
        "-I$HOME/gemini-cli-codeassist/mcp-stdio-perl/local/lib/perl5",
        "$HOME/gemini-cli-codeassist/mcp-stdio-perl/server.pl"
      ]
    }
  }
}

Validation with Gemini CLI

Finally, Gemini CLI is restarted and the MCP connection over stdio to the Perl code is validated. The full Gemini CLI session will start:

> /mcp list

🟢 hello-stdio-perl - Ready (1 tool)
  Tools:
  - greet

> greet Camel

✦ I will call the greet tool with the parameter Camel.

╭───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
│ ? greet (hello-stdio-perl MCP Server) {"param":"Camel"} ← │
│ │
│ MCP Server: hello-stdio-perl │
│ Tool: greet │
│ │
│ Allow execution of MCP tool "greet" from server "hello-stdio-perl"? │
│ │
│ 1. Allow once │
│ 2. Allow tool for this session │
│ ● 3. Allow all server tools for this session │
│ 4. No, suggest changes (esc) │
│ │
╰───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
╭───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
│ ✓ greet (hello-stdio-perl MCP Server) {"param":"Camel"} │
│ │
│ Camel │
╰───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
✦ Hello Camel! How can I help you today?

Extending the MCP Server with Gemini CLI

Now that the basic MCP functionality has been tested and verified with Gemini CLI, the Perl server code can be extended with new MCP tools:

> with the perl project add new MCP tools - 
one that gets detailled Perl system information, 
one that gets server information
and one that gets the current time

Gemini CLI will get right to work:

│ 53 │
│ 54 + $server->tool( │
│ 55 + name => 'get_perl_system_info', │
│ 56 + description => 'Get detailed Perl system information (Config::myconfig).', │
│ 57 + input_schema => { │
│ 58 + type => 'object', │
│ 59 + properties => {}, │
│ 60 + }, │
│ 61 + code => sub ( $tool, $args ) { │
│ 62 + $log->info("Executed get_perl_system_info tool"); │
│ 63 + return Config::myconfig(); │
│ 64 + } │
│ 65 + ); │
│ 66 + │ │
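For readability, here is the generated tool as plain code, reconstructed from the diff above ($server and $log come from the surrounding server.pl, and sub signatures are assumed enabled, e.g. via use v5.36):

use Config;

$server->tool(
    name         => 'get_perl_system_info',
    description  => 'Get detailed Perl system information (Config::myconfig).',
    input_schema => {
        type       => 'object',
        properties => {},
    },
    code => sub ( $tool, $args ) {
        $log->info('Executed get_perl_system_info tool');
        return Config::myconfig();   # the familiar `perl -V` summary
    },
);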

Once completed, the changes will be summarized:

✦ I have added the three requested MCP tools to server.pl and verified them with a new test suite.

  Summary of changes:
   1. Modified `server.pl`:
       * Imported Config module.
       * Added get_perl_system_info: Returns detailed Perl configuration using Config::myconfig().
       * Added get_server_info: Returns server name, version, status, and transport type.
       * Added get_current_time: Returns the current system time.
       * Refactored the script to be testable (returns $server and only runs to_stdio if run as a script).
   2. Created `t/01_tools.t`:
       * A new test file that loads server.pl, inspects the registered tools, and executes their callbacks to verify correctness.

  Verification:
   * t/00_compile.t passed.
   * t/01_tools.t passed, confirming all 3 tools are registered and return expected data.

Restart Gemini CLI and check for the new MCP tools:

xbill@penguin:~/gemini-cli-codeassist/mcp-stdio-perl$ make
Installing dependencies...
--> Working on .
Configuring /home/xbill/gemini-cli-codeassist/mcp-stdio-perl ... OK
<== Installed dependencies for .. Finishing.
Running tests...
t/00_compile.t .. ok   
t/01_tools.t .... 1/? {"asctime":"2026-01-12T01:08:56.49922Z","levelname":"INFO","message":"Executed get_server_info tool","name":"root"}
{"asctime":"2026-01-12T01:08:56.49954Z","levelname":"INFO","message":"Executed get_current_time tool","name":"root"}
{"asctime":"2026-01-12T01:08:56.49965Z","levelname":"INFO","message":"Executed get_perl_system_info tool","name":"root"}
t/01_tools.t .... ok    
All tests successful.
Files=2, Tests=13, 0 wallclock secs ( 0.01 usr 0.01 sys + 0.35 cusr 0.05 csys = 0.42 CPU)
Result: PASS

And list the new MCP Tools:

🟢 hello-stdio-perl - Ready (4 tools)
  Tools:
  - get_current_time
  - get_perl_system_info
  - get_server_info
  - greet

To run the tools:

> get_current_time

✦ I will get the current time for you.
╭───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
│ ? get_current_time (hello-stdio-perl MCP Server) {} ← │
│ │
│ MCP Server: hello-stdio-perl │
│ Tool: get_current_time │
│ │
│ Allow execution of MCP tool "get_current_time" from server "hello-stdio-perl"? │
│ │
│ 1. Allow once │
│ 2. Allow tool for this session │
│ ● 3. Allow all server tools for this session │
│ 4. No, suggest changes (esc) │
│ │
╰───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
✦ I will get the current time for you.
╭───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
│ ✓ get_current_time (hello-stdio-perl MCP Server) {} │
│ │
│ Sun Jan 11 20:12:02 2026 │
╰───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
✦ I've retrieved the current time. I am now ready for your first command.

and Perl information:

> get_perl_system_info

✦ I will retrieve the Perl system information for you.
╭───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
│ ✓ get_perl_system_info (hello-stdio-perl MCP Server) {} │
│ │
│ Summary of my perl5 (revision 5 version 36 subversion 0) configuration: │
│ │
│ Platform: │
│ osname=linux │
│ osvers=4.19.0 │
│ archname=x86_64-linux-gnu-thread-multi │
│ uname='linux localhost 4.19.0 #1 smp debian 4.19.0 x86_64 gnulinux ' │

and system information:

> get_server_info

✦ I will get the information about this MCP server.

╭───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
│ ✓ get_server_info (hello-stdio-perl MCP Server) {} │
│ │
│ {"name":"Perl MCP Stdio Server","perl_version":"v5.36.0","status":"running","transport":"stdio","version":"1.0.0"} │
╰───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
✦ The server is running version 1.0.0 of the Perl MCP Stdio Server on Perl v5.36.0.

Summary

The strategy of using Perl for MCP development with Gemini CLI was validated with an incremental, step-by-step approach.

A minimal stdio transport MCP server was started from Perl source code and validated with Gemini CLI running as an MCP client in the same local environment.

Gemini CLI was then used to extend the sample Perl code with several MCP tools and to use these tools inside the context of the underlying LLM.

Welcome to Week #356 of The Weekly Challenge.
Thank you Team PWC for your continuous support and encouragement.
As you know, The Weekly Challenge primarily focuses on Perl and Raku. During Week #018, we received solutions to The Weekly Challenge - 018 by Orestis Zekai in Python. It was a pleasant surprise to receive solutions in something other than Perl and Raku. Ever since, regular team members have also contributed in other languages like Ada, APL, Awk, BASIC, Bash, Bc, Befunge-93, Bourne Shell, BQN, Brainfuck, C3, C, CESIL, Chef, COBOL, Coconut, C Shell, C++, Clojure, Crystal, CUDA, D, Dart, Dc, Elixir, Elm, Emacs Lisp, Erlang, Excel VBA, F#, Factor, Fennel, Fish, Forth, Fortran, Gembase, Gleam, GNAT, Go, GP, Groovy, Haskell, Haxe, HTML, Hy, Idris, IO, J, Janet, Java, JavaScript, Julia, K, Kap, Korn Shell, Kotlin, Lisp, Logo, Lua, M4, Maxima, Miranda, Modula 3, MMIX, Mumps, Myrddin, Nelua, Nim, Nix, Node.js, Nuweb, Oberon, Octave, OCaml, Odin, Ook, Pascal, PHP, PicoLisp, Python, PostgreSQL, Postscript, PowerShell, Prolog, R, Racket, Rexx, Ring, Roc, Ruby, Rust, Scala, Scheme, Sed, Smalltalk, SQL, Standard ML, SVG, Swift, Tcl, TypeScript, Typst, Uiua, V, Visual BASIC, WebAssembly, Wolfram, XSLT, YaBasic and Zig.
Updates for great CPAN modules released last week. A module is considered great if its favorites count is greater than or equal to 12.

  1. App::Greple - extensible grep with lexical expression and region handling
    • Version: 10.02 on 2026-01-09, with 56 votes
    • Previous CPAN version: 10.01 was 9 days before
    • Author: UTASHIRO
  2. App::Netdisco - An open source web-based network management tool.
    • Version: 2.097002 on 2026-01-09, with 818 votes
    • Previous CPAN version: 2.097001 
    • Author: OLIVER
  3. App::Sqitch - Sensible database change management
    • Version: v1.6.1 on 2026-01-06, with 3087 votes
    • Previous CPAN version: v1.6.0 was 3 months before
    • Author: DWHEELER
  4. CPANSA::DB - the CPAN Security Advisory data as a Perl data structure, mostly for CPAN::Audit
    • Version: 20260104.001 on 2026-01-04, with 25 votes
    • Previous CPAN version: 20251228.001 was 6 days before
    • Author: BRIANDFOY
  5. DateTime::Format::Natural - Parse informal natural language date/time strings
    • Version: 1.23 on 2026-01-04, with 19 votes
    • Previous CPAN version: 1.23 was 5 days before
    • Author: SCHUBIGER
  6. Firefox::Marionette - Automate the Firefox browser with the Marionette protocol
    • Version: 1.69 on 2026-01-10, with 19 votes
    • Previous CPAN version: 1.68 was 3 months, 26 days before
    • Author: DDICK
  7. GD - Perl interface to the libgd graphics library
    • Version: 2.84 on 2026-01-04, with 32 votes
    • Previous CPAN version: 2.83 was 1 year, 6 months, 11 days before
    • Author: RURBAN
  8. IO::Socket::SSL - Nearly transparent SSL encapsulation for IO::Socket::INET.
    • Version: 2.098 on 2026-01-06, with 49 votes
    • Previous CPAN version: 2.097 
    • Author: SULLR
  9. JSON::Schema::Modern - Validate data against a schema using a JSON Schema
    • Version: 0.632 on 2026-01-06, with 16 votes
    • Previous CPAN version: 0.631 was 12 days before
    • Author: ETHER
  10. MetaCPAN::Client - A comprehensive, DWIM-featured client to the MetaCPAN API
    • Version: 2.037000 on 2026-01-07, with 27 votes
    • Previous CPAN version: 2.036000 
    • Author: MICKEY
  11. MIME::Lite - low-calorie MIME generator
    • Version: 3.035 on 2026-01-08, with 35 votes
    • Previous CPAN version: 3.034 was 2 days before
    • Author: RJBS
  12. Module::Starter - a simple starter kit for any module
    • Version: 1.81 on 2026-01-09, with 34 votes
    • Previous CPAN version: 1.80 
    • Author: XSAWYERX
  13. Perl::Tidy - indent and reformat perl scripts
    • Version: 20260109 on 2026-01-08, with 147 votes
    • Previous CPAN version: 20250912 was 3 months, 26 days before
    • Author: SHANCOCK
  14. perlsecret - Perl secret operators and constants
    • Version: 1.018 on 2026-01-09, with 55 votes
    • Previous CPAN version: 1.017 was 4 years, 2 months before
    • Author: BOOK
  15. Type::Tiny - tiny, yet Moo(se)-compatible type constraint
    • Version: 2.010001 on 2026-01-06, with 148 votes
    • Previous CPAN version: 2.010000 was 7 days before
    • Author: TOBYINK
  16. UV - Perl interface to libuv
    • Version: 2.001 on 2026-01-06, with 14 votes
    • Previous CPAN version: 2.000 was 4 years, 5 months, 8 days before
    • Author: PEVANS

In a script I'm using constants (use constant ...) to allow re-use in actual regular expressions, using the pattern from https://stackoverflow.com/a/69379743/6607497. However, when using a {...} repeat specifier after such a constant expansion, Perl wants to treat the constant as a hash variable.

The question is how to avoid that.

Code example:

main::(-e:1):   1
  DB<1> use constant CHARSET => '[[:graph:]]'

  DB<2> x "foo" =~ qr/^[[:graph:]]{3,}$/
0  1
  DB<3> x "foo" =~ qr/^${\CHARSET}{3,}$/
Not a HASH reference at (eval 8)[/usr/lib/perl5/5.26.1/perl5db.pl:738] line 2.
  DB<4> x "foo" =~ qr/^${\CHARSET}\{3,}$/
  empty array
  DB<5> x $^V
0  v5.26.1

According to https://stackoverflow.com/a/79845011/6607497, a solution may be to add a space that is then ignored, like this: qr/^${\CHARSET} {3,}$/x; however, I don't understand why this works, because outside of a regular expression the space before { is ignored as well:

  DB<6> x "foo" =~ qr/^${\CHARSET} {3,}$/x
0  1
  DB<7> %h = (a => 3)

  DB<8> x $h{a}
0  3
  DB<9> x $h {a}
0  3

The manual page (perlop(1), under "Quote and Quote-like Operators") isn't very precise about that:

For constructs that do interpolate, variables beginning with "$" or "@" are interpolated. Subscripted variables such as $a[3] or "$href->{key}[0]" are also interpolated, as are array and hash slices. But method calls such as "$obj->meth" are not.
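
One way to sidestep the problem (a hedged sketch, not part of the original question) is to keep the quantifier from being parseable as a hash subscript at interpolation time, for example by wrapping the interpolated constant in a non-capturing group so that the {3,} follows a closing parenthesis rather than the variable:

use strict;
use warnings;
use constant CHARSET => '[[:graph:]]';

# ${\CHARSET} interpolates the constant; the group's ")" now separates
# it from "{3,}", so the interpolation code no longer sees a subscript.
my $re = qr/^(?:${\CHARSET}){3,}$/;
print "matched\n" if "foo" =~ $re;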

foobar is a Perl script that prints to both standard output and standard error. In a separate Perl script echo-stderr, I run foobar and capture its standard error using IPC::Open3's open3 function, and simply echo it back.

Here's the code for echo-stderr:

#!/usr/bin/perl -w

use IPC::Open3;
use Symbol 'gensym';

$fh = gensym;
$pid = open3('STDIN', 'STDOUT', $fh, './foobar') or die "$0: failed to run ./foobar\n";

while ( <$fh> ) {
    print STDERR $_;
}

close $fh;
waitpid($pid, 0);

The result is that whatever foobar writes to standard error is printed, but nothing that it writes to standard output is. And there is an error at the end:

<message written to STDERR>
<message written to STDERR>
...
Unable to flush stdout: Bad file descriptor

What is the reason for this error?
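
For comparison, here is a hedged sketch (not the original script) of the usual way to leave the parent's handles alone, assuming the dup syntax described in the IPC::Open3 docs: arguments beginning with '<&' or '>&' ask open3 to dup existing filehandles for the child, and open3 raises an exception on failure rather than returning false, so the "or die" is not needed:

#!/usr/bin/perl
use strict;
use warnings;

use IPC::Open3;
use Symbol 'gensym';

# '<&STDIN' / '>&STDOUT' dup the parent's handles for the child instead
# of reopening the package filehandles STDIN and STDOUT as pipes.
my $err = gensym;
my $pid = open3('<&STDIN', '>&STDOUT', $err, './foobar');

while ( my $line = <$err> ) {
    print STDERR $line;
}

close $err;
waitpid($pid, 0);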

Software developed in Perl, using Catalyst, DBIx, JS, and SQL.
Version control: SVN.

App::HTTPThis: the tiny web server I keep reaching for

Perl Hacks

Whenever I’m building a static website, I almost never start by reaching for Apache, nginx, Docker, or anything that feels like “proper infrastructure”. Nine times out of ten I just want a directory served over HTTP so I can click around, test routes, check assets, and see what happens in a real browser.

For that job, I’ve been using App::HTTPThis for years.

It’s a simple local web server you run from the command line. Point it at a directory, and it serves it. That’s it. No vhosts. No config bureaucracy. No “why is this module not enabled”. Just: run a command and you’ve got a website.

Why I’ve used it for years

Static sites are deceptively simple… right up until they aren’t.

  • You want to check that relative links behave the way you think they do.

  • You want to confirm your CSS and images are loading with the paths you expect.

  • You want to reproduce “real HTTP” behaviour (caching headers, MIME types, directory handling) rather than viewing files directly from disk.

Sure, you can open file:///.../index.html in a browser, but that’s not the same thing as serving it over HTTP. And setting up Apache (or friends) feels like bringing a cement mixer to butter some toast.

With http_this, the workflow is basically:

  • cd into your site directory

  • run a single command

  • open a URL

  • get on with your life

It’s the “tiny screwdriver” that’s always on my desk.
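
In practice, that looks something like this (a minimal example; on a default install it listens on port 7007, and there's a --port option if you need a different one):

cd my-site
http_this
# then open http://localhost:7007/ in a browser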

Why I took it over

A couple of years ago, the original maintainer had (entirely reasonably!) become too busy elsewhere and the distribution wasn’t getting attention. That happens. Open source is like that.

But I was using App::HTTPThis regularly, and I had one small-but-annoying itch: when you visited a directory URL, it would always show a directory listing – even if that directory contained an index.html. So instead of behaving like a typical web server (serve index.html by default), it treated index.html as just another file you had to click.

That’s exactly the sort of thing you notice when you’re using a tool every day, and it was irritating enough that I volunteered to take over maintenance.

(If you want to read more on this story, I wrote a couple of blog posts.)

What I’ve done since taking it over

Most of the changes are about making the “serve a directory” experience smoother, without turning it into a kitchen-sink web server.

1) Serve index pages by default (autoindex)

The first change was to make directory URLs behave like you’d expect: if index.html exists, serve it automatically. If it doesn’t, you still get a directory listing.

2) Prettier index pages

Once autoindex was in place, I then turned my attention to the fallback directory listing page. If there isn’t an index.html, you still need a useful listing — but it doesn’t have to look like it fell out of 1998. So I cleaned up the listing output and made it a bit nicer to read when you do end up browsing raw directories.

3) A config file

Once you’ve used a tool for a while, you start to realise you run it the same way most of the time.

A config file lets you keep your common preferences in one place instead of re-typing options. It keeps the “one command” feel, but gives you repeatability when you want it.

4) --host option

The ability to control the host binding sounds like an edge case until it isn’t.

Sometimes you want:

  • only localhost access for safety;

  • access from other devices on your network (phone/tablet testing);

  • behaviour that matches a particular environment.

A --host option gives you that control without adding complexity to the default case.

The Bonjour feature (and what it’s for)

This is the part I only really appreciated recently: App::HTTPThis can advertise itself on your local network using mDNS / DNS-SD – commonly called Bonjour on Apple platforms, Avahi on Linux, and various other names depending on who you’re talking to.

It’s switched on with the --name option.

http_this --name MyService

When you do that, http_this publishes an _http._tcp service on your local network with the instance name you chose (MyService in this case). Any device on the same network that understands mDNS/DNS-SD can then discover it and resolve it to an address and port, without you having to tell anyone, “go to http://192.168.1.23:7007/”.

Confession time: I ignored this feature for ages because I’d mentally filed it under “Apple-only magic” (Bonjour! very shiny! probably proprietary!). It turns out it’s not Apple-only at all; it’s a set of standard networking technologies that are supported on pretty much everything, just under a frankly ridiculous number of different names. So: not Apple magic, just local-network service discovery with a branding problem.

Because I’d never really used it, I finally sat down and tested it properly after someone emailed me about it last week. It worked nicely; nicely enough that I’ve now added a BONJOUR.md file to the repo with a practical explanation of what’s going on, how to enable it, and a few ways to browse/discover the advertised service.

(If you’re curious, look for _http._tcp and your chosen service name.)
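
If you want to watch the discovery side, any standard mDNS/DNS-SD browser will show the advertised service; for example (assuming Avahi's command-line tools on Linux, or the built-in dns-sd on macOS):

avahi-browse -rt _http._tcp
# or, on macOS:
dns-sd -B _http._tcp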

It’s a neat quality-of-life feature if you’re doing cross-device testing or helping someone else on the same network reach what you’re running.

Related tools in the same family

App::HTTPThis is part of a little ecosystem of “run a thing here quickly” command-line apps. If you like the shape of http_this, you might also want to look at these siblings:

  • https_this : like http_this, but served over HTTPS (useful when you need to test secure contexts, service workers, APIs that require HTTPS, etc.)

  • cgi_this : for quick CGI-style testing without setting up a full web server stack

  • dav_this : serves content over WebDAV (handy for testing clients or workflows that expect DAV)

  • ftp_this : quick FTP server for those rare-but-real moments when you need one

They all share the same basic philosophy: remove the friction between “I have a directory” and “I want to interact with it like a service”.

Wrapping up

I like tools that do one job, do it well, and get out of the way. App::HTTPThis has been that tool for me for years and it’s been fun (and useful) to nudge it forward as a maintainer.

If you’re doing any kind of static site work — docs sites, little prototypes, generated output, local previews — it’s worth keeping in your toolbox.

And if you’ve got ideas, bug reports, or platform notes (especially around Bonjour/Avahi weirdness), I’m always happy to hear them.

The post App::HTTPThis: the tiny web server I keep reaching for first appeared on Perl Hacks.

A jq-compatible JSON processor written in pure Perl, designed for environments where jq cannot be installed.

(dlxxxi) 8 great CPAN modules released last week

Niceperl
Updates for great CPAN modules released last week. A module is considered great if its favorites count is greater than or equal to 12.

  1. App::cpm - a fast CPAN module installer
    • Version: 0.998003 on 2025-12-29, with 177 votes
    • Previous CPAN version: 0.998002 was 24 days before
    • Author: SKAJI
  2. App::Greple - extensible grep with lexical expression and region handling
    • Version: 10.01 on 2025-12-31, with 56 votes
    • Previous CPAN version: 10.00 was 19 days before
    • Author: UTASHIRO
  3. App::Music::ChordPro - A lyrics and chords formatting program
    • Version: v6.090.1 on 2026-01-03, with 432 votes
    • Previous CPAN version: v6.090.0 was 2 months, 3 days before
    • Author: JV
  4. CPANSA::DB - the CPAN Security Advisory data as a Perl data structure, mostly for CPAN::Audit
    • Version: 20251228.001 on 2025-12-29, with 25 votes
    • Previous CPAN version: 20251221.001 was 7 days before
    • Author: BRIANDFOY
  5. DBD::SQLite - Self Contained SQLite RDBMS in a DBI Driver
    • Version: 1.78 on 2026-01-02, with 107 votes
    • Previous CPAN version: 1.76 was 1 year, 2 months, 14 days before
    • Author: ISHIGAKI
  6. Module::Starter - a simple starter kit for any module
    • Version: 1.79 on 2026-01-03, with 33 votes
    • Previous CPAN version: 1.78 was 7 months, 30 days before
    • Author: XSAWYERX
  7. Type::Tiny - tiny, yet Moo(se)-compatible type constraint
    • Version: 2.010000 on 2025-12-30, with 148 votes
    • Previous CPAN version: 2.009_003 was 7 days before
    • Author: TOBYINK
  8. WebService::Dropbox - Perl interface to Dropbox API
    • Version: 2.10 on 2025-12-29, with 12 votes
    • Previous CPAN version: 2.09 was 4 years, 6 months, 14 days before
    • Author: ASKADNA

Horror Movie Month 2024

rjbs forgot what he was saying

Yesterday, I posted about the books I read in 2025, which made me remember that I never posted about the (horror) movies we watched in October 2024. So, I thought I’d get around to that. Of course this will be short and lossy, right? It’s been over a year.

Here’s what we watched for Horror Movie Month in 2024, at least according to my notes!

October 1: Raw (2016)

Girl goes to college, finally lets loose by becoming a cannibal. This movie was French and you’d know it even if you watched it dubbed. It was okay. It was worth my time.

October 2: Tragedy Girls (2017)

Two high school girls who are interested in death try to make more of it happen. It was a horror-comedy, and it was fun. Brianna Hildebrand, who you may remember as Negasonic Teenage Warhead, was in it.

October 4: V/H/S/Beyond (2024)

Honestly, apart from the 2025 one, most of the V/H/S movies are about the same to me: mixed bags, but usually not quite worth the whole watch. This one was that too. It had its moments.

October 5: Humanist Vampire Seeking Consenting Suicidal Person (2023)

Honestly, I’d watch just for “Want to see a French-Canadian horror movie?”

A young woman in a family of vampires really doesn’t want to go hunt for blood, but her parents have reluctantly become insistent. She decides she’ll look for somebody who’d be willing to donate.

It was good, and sort of a horror-comedy. It didn’t feel like every other movie, which was good.

October 6: Onyx the Fortuitous and the Talisman of Souls (2023)

I liked this the least of everybody in my household, I think. It was sometimes pretty funny, but the main character got on my nerves. I got the impression he is a YouTube character with some following, maybe? Like Ernest P. Worrell or other over-the-top “originally in small doses” characters, he was just too much here.

That said, we still make references to the guy’s catch phrase, so it stuck with us.

October 6: Cuckoo (2024)

This was one of the big hits of “general horror movies of 2024”, so I was glad we got to watch it. I liked it! It wasn’t perfect, but it did well at being one of those “Why can’t everybody else see how messed up this lovely place really is?” movies.

October 7: Let the Wrong One In (2021)

This movie was really stupid and I liked it. First off, there was a character named Deco, which made me think of The Commitments, which won points. Also, Anthony Stewart Head.

Basically it’s sort of a slapstick farcical vampire movie set in Ireland. Honestly, “What if [some successful movie] but the protagonists were idiots?” is a pretty good formula.

October 8: The Witches of Eastwick (1987)

Still a classic.

Sure, it’s kind of a mess here and there, but it’s got a great cast and it just goes for it. I read recently that there was talk about casting other people (other than Jack Nicholson) as Daryl Van Horne, which seems like it could only have been worse. One name mentioned was Bill Murray. What?! This was a nearly perfect vehicle for Jack Nicholson doing comedy, and Cher, Susan Sarandon, and Michelle Pfeiffer were a lot of fun, too.

The cherry scene!

October 9: Courtney Gets Possessed (2023)

I barely remember this one. I think it was funny enough? Demonic hijinks at a bachelorette party.

October 10: There’s Something Wrong with the Children (2023)

Two parents, their two kids, and an adult friend take a camping trip. The kids wander off in the woods and when they come back, they are… off. Things keep getting worse.

This was good. It wasn’t great, but it was good. You want to yell, “Wake up, people, your kids are busted!”

October 12: 6:45 (2021)

It took me a while to remember this one. A couple take an ill-advised holiday to an island town, which leads to a deadly time loop. It was okay, but there are many better movies to watch instead. (Look, maybe it’s better than I remember, but given I barely remember it…)

October 13: Oddity (2024)

I didn’t remember this until reading the synopsis, but it was quite good. So maybe my “it’s bad because I don’t remember it” take above is wrong!

A woman is murdered at her secluded fixer-upper in the countryside. Later, her twin sister shows up and is really weird. What’s going on? You should just watch it, probably. Not a comedy.

October 14: Mr. Crocket (2024)

This is sort of like “what if there was a haunted video tape that showed you a cutesy TV show for kids, but also it was evil?” I wanted to like it, but it was just ugly. It wasn’t fun or funny, just dark. It wasn’t darkly funny, although maybe that was the goal.

October 15: Evil Dead Ⅱ (1987)

I think we watched this because Marty hadn’t seen it. Look, it’s fine. It’s a lot better than the first version. I think it’s just not exactly my bag. (I really like Bruce Campbell, though!)

October 16: Cube (2021)

I really liked Cube! This is not that movie, though, it’s a 2021 remake from Japan. Don’t bother. It is worse in every way. Maybe it’s okay, but it’s not significantly different, so go with the original.

October 18: Zombie Town (2023)

A reclusive movie director releases one more movie, and it turns everybody in town into zombies. Kids fight back.

This kind of movie could’ve been fun, but it wasn’t. It had two of the Kids in the Hall in it! What a waste.

October 19: The Finale (2023)

Oh yeah, this one.

Murders start happening at a summer theater camp. Everybody has a motive. Who did it?

Well, look, I think this was maybe better than the related Stage Fright, but it was bad. It was way too long. It was sometimes nonsensical. I do not recommend it.

October 19: Invitation to Hell (1984)

This gets huge points from me for “picked a weird premise and didn’t back down.” Wes Craven directs. A family moves to a new planned town where the father has taken a great new job. Everybody is obsessed with the local country club and its manager. Like, weirdly obsessed. What the heck is going on in town? Also, Robert Urich and Susan Lucci? Wild.

Not great, but I am glad I watched it.

October 20: Corporate Animals (2019)

A bunch of coworkers on a team-building exercise end up trapped in a cave. Demi Moore?! We had fun. It was stupid in a good way. The company specialized in edible cutlery, which paid off a few ways.

October 20: Stranger in Our House (1978)

Wes Craven again, this time with Linda Blair. It wasn’t great, sadly, and the concept has been done a bunch of times. Orphaned kid moves in with other family, and only one family member realizes that maybe this is a bad idea. It was… fine.

October 24: Little Evil (2017)

Adam Scott becomes the step-dad to the Antichrist and really tries to make things work. This was not amazing, but it was much better than I expected. I don’t mind having watched it, but I wouldn’t watch it again.

Good job casting the really creepy kid, though!

October 25: Deer Camp ‘86 (2022)

A bunch of guys go hunting and get into trouble. I remember nothing.

October 26: The Day of the Beast (1995)

A priest figures out how to predict the exact birth of the Antichrist, and enlists the help of a headbanger and a TV occultist to save the world. Was this a comedy on purpose? I just don’t know. It was weird, and unpredictable, and so I liked it.

October 27: The Strangers (2008)

What a lousy movie to end on. It’s a boring, tedious home invasion movie. I see it was 86 minutes long, but I remember it feeling much longer. Also, I think they remade it into a three part movie? I can’t imagine.

I just didn’t care about anyone or anything in this movie.

the books I read in 2025

rjbs forgot what he was saying

I don’t take the Goodreads “reading challenge” too seriously, but I did hit my target last year, and it felt good. I thought I’d try again this year and I did get it done – only just, though, as I finished my last three books in the last two days of the year. I think I would’ve liked to read a bit more through the year, but sometimes I just wasn’t feeling it. So it goes! I think this is a “structure your time” problem, but also it’s not the most pressing thing on my agenda, you know?

So, here’s what I read, not in order, and some brief notes.

Greg Egan

Last year, I read five Greg Egan books. This year, just two. First, I read The Book of All Skies, which I enjoyed. It’s the story of a group of people investigating the frontiers of their very weirdly-shaped world. As with many Egan books, there’s a lot of very weird math and physics under the hood, but it wasn’t critical to think too hard about them, and I think that made the story more enjoyable for me. In this book, they would’ve gotten in the way. That said, when I finished the book I went and read a bunch of Egan’s notes on the underlying ideas, which were interesting (insofar as I understood them).

Later, I read Schild’s Ladder, which was roughly the opposite. That is, it was one of the most physics-heavy Egan books I’ve read. More than once, I wanted to take photos of the page because it was a wall of thick jargon. I did not enjoy the book. At the beginning, I said, “Oh, this is going to be Egan’s take on Cat’s Cradle!” That would’ve been very interesting, especially because Egan and Vonnegut are so, so different. Or: maybe it was that, but I didn’t care to think about the comparison by the end. It reminded me of Vinge, too, but not in a way that excited me. Anyway, look, I’ve read a lot of Egan, and I will read more. This just didn’t hit home.

Effectiveness

“Effectiveness” is my shelf (or label or tag or whatever they call it now) in Goodreads for books on productivity and management. I have a lot of books in that queue, but I only make slow progress, for many reasons.

My favorite of the ones I read this year, by a long way, was Radical Candor. This is one of those books that I’d read about many times. It sounded not bad, but not amazing. But, of course, I’d only been seeing the shadows on the wall. It was great, and I hope I will go back to it in the future to try to puzzle out more ways to do better at my job. It really resonated with me, and I’ve brought it up over and over when talking to other managers, this year.

I also read Laziness Does Not Exist, which I didn’t love. It was okay. I feel the author would probably just give me a knowing “don’t you hear yourself??” look, but I kept wanting to say, “Yes, don’t work yourself sick, but honestly you are going too far.” I think the issue is that an indictment of a society-wide problem requires a massive-scaled critique. But “the Laziness Lie has you in its grip!”, over and over, was too much for me. (It was also funny that I finished this book just today, December 31st, and it had text saying “Don’t get worked up trying to meet your Goodreads goals”!)

Finally, as I wanted to get a bit more handle on some of my team’s rituals, I read Liftoff: Start and Sustain Agile Teams. I found it totally unremarkable, so I have no remarks.

Boss Fight Books

Boss Fight Books publishes short books about influential or otherwise important video games. The books are written by people who found the games to be important to them.

The first one I read was Animal Crossing by Kelsey Lewin. I’ve played all the main Animal Crossing games and have enjoyed them all. (Well, no, the iOS one was awful.) This book, at a pleasing 2⁸ pages, talked about the origin of the game, its weird launch history starting with the Nintendo 64DD, how it changed over time, and how the author enjoyed it (or didn’t) over time. I enjoyed the book, and felt like I’d read more like this – but it was also clear that a lot of the book was about the author’s life, which wasn’t really what I wanted. So, it wasn’t a bad book, it just wasn’t exactly what I wanted.

Next I read the PaRappa the Rapper and ZZT books, which were similarly a mix of “I am very interested!” and “I am not particularly interested”. I knew what I was getting into, though, so I had no complaint for the authors. I just sort of wish there were more books about these games, focused more exclusively on the history and technology behind them.

I was surprised by how few of my peers remembered ZZT. I remember it being both impressive and influential. I was also surprised to learn how programmable its world builder was, and that ZZT (the game)’s author was that Tim Sweeney. (The book’s author was Anna Anthropy, which was one of the reasons I wanted to read this book.)

Finally, I read the book on Spelunky. I almost didn’t, but then I saw that the author was Derek Yu, also the primary creator of Spelunky itself! This book was by far closest to what I’d want from these books, if I was in charge. I got a copy for my nephews, too, who I introduced to the game a few years ago.

Stephen King

I read three Stephen King books this year, all story collections. I’ve been trying to catch up on reading all his story collections, and I’m very nearly done, now.

First, Four Past Midnight, from 1990. It contains four novellas, all of which I liked okay. I read it in part because I’d been doing some idle research into King’s repeated setting of Castle Rock, and saw that The Sun Dog (a story in this collection) was in some ways tied up with Needful Things.

After that, I read Hearts in Atlantis. This was a frustrating experience, because I kept thinking that maybe I’d read it already, twenty years ago, but I couldn’t be sure. This was extra frustrating because it seemed to me like one of King’s best books. Structurally and textually, it was excellent. I would recommend this to somebody who wasn’t sure they wanted to read Stephen King.

Finally, You Like It Darker. This is a collection of short stories published just last year. It was good! I enjoyed just about all of it, maybe most especially the final three stories. One of these was a sequel to Cujo, which I definitely did not expect to be reading!

Technical Books

This year, I’ve become the full-time lead of Fastmail’s Cyrus team. A big part of my team’s work is maintaining the open source Cyrus IMAP server. It’s written in C. My C is miserable, and was probably at its best in 1992. I need to get better. I read two C books this year: Effective C and Understanding and Using C Pointers. I think both were fine, but it’s hard to say. I’m not writing much C, day to day, so probably some of what I learned has already faded away. Still, I thought they were both clear and explained a bunch of topics that I hadn’t understood or only barely understood. Hard to judge, but definitely not bad. I can imagine going back to them later, when doing real work.

I also read tmux 3, a book about tmux. I like tmux quite a lot, and this isn’t the first little book I’ve read about it. It’s hard for me to say what I thought of it. I think it was a bit of a mish-mash for me. I was coming to it with a pretty long history with tmux, so lots of things were old hat and not worth my time. But as with many complex tools, even among the fundamentals there were lots of things I didn’t know. Here’s my biggest praise for the book: After I read it, I went back to a few sections I’d flagged and worked through my .tmux.conf, making improvements based on what the book taught me.

Slough House

Okay, so my biggest category of books was the Slough House series by Mick Herron. A full third of the books I read this year were these books.

Here are the titles:

  • Dead Lions
  • Real Tigers
  • Standing by the Wall
  • Spook Street
  • Nobody Walks
  • London Rules
  • Joe Country
  • Slough House
  • Bad Actors
  • The Secret Hours
  • Reconstruction
  • Clown Town

Look, they’re all very good. That’s why I read them! The only notable exception, I think, is Reconstruction. It’s fine. It’s just the least Slough House-y book, really tied in only by one character, and structured very differently from the rest. I’d almost recommend skipping it. It was a bit of a bummer that it was the last one I read for months. The last one I read, Clown Town, was only released this year, and I read it roughly immediately. (Thanks, Gloria, for giving me a surprise copy!)

Other Fiction

I read Thorns by Robert Silverberg, which was a 1967 nominee for the Nebula and Hugo. I can’t remember why I read it. I think it got onto my reminders list ages ago, and then it was on deep discount. I would’ve done better to just not read it. In 1967, it may have been interesting, but it didn’t age well.

I read How Long ‘til Black Future Month? by N.K. Jemisin, whose massively successful Broken Earth series I enjoyed a few years ago. This is a short story collection, and I’m a sucker for a good short story collection. And this was good. I’m told that LeVar Burton read two of these stories on his podcast LeVar Burton Reads, and I look forward to listening to them.

A few years ago, I finally read A Fire Upon the Deep, by Vinge. It was excellent, with a sprawling scope, a complex and coherent setting, and a whole mess of interesting ideas that all slotted together. Mark Dominus told me that the sequel, A Deepness in the Sky, was even better, but “very lonesome”. I agree! Vinge’s ability to spin so many plates, each one interesting on its own, and then to land them all into one neat pile was impressive and satisfying.

I read Ship Breaker and its sequel, The Drowned Cities, by Paolo Bacigalupi. They were okay, but I didn’t bother with the third book. Bacigalupi’s sci-fi work for adults is very good, and I’ve re-read a bunch of it. (I don’t think I re-read Pump 6 in its entirety this year, but I re-read a bunch of it.) The Ship Breaker books are young adult fiction, and all I could see on the page was all the depth and nuance missing compared to his other work. It probably would’ve been better when I was twelve. Given that it’s a book for that audience, who am I to complain?

I read Dungeon Crawler Carl because Bryan was reading it and said it sounded fun. It was fun, but I think too long for me. Everything about it was just a bit much. That could’ve been fun for two short books or so, but it was the first book in a seven book series, with books topping six hundred pages. I tapped out, and will probably read a summary some day.

Finally, I read Virtual Unrealities, a sci-fi story collection by the great Alfred Bester. I think I picked it up because I wanted to read “Fondly Fahrenheit”, which was good. I read it in the first week of January, so it’s been a while and I don’t remember it terribly well. My recollection was that I thought it was okay, but on the whole not anywhere near as good as The Demolished Man or The Stars My Destination. That’s the problem with writing massive, incredible successes, I guess!

Other Nonfiction

The Society of the Spectacle is the longest 150-page book I’ve ever read. According to Goodreads, I spent almost nine years reading it. It’s a lot, but it’s very good, and I think I will re-read parts of it again, probably several times. It’s one of the key texts of Situationism, a movement in post-WWII European socialism. The book is made up of 221 numbered paragraphs, which construct and explain the concept of “the spectacle”, a 20th-century (and, I’d say, 21st-century) conception of the problems of capitalism and, to an extent, imagined solutions. It’s short, but each paragraph deserves a good long think. You can’t just sit down and read the book in an afternoon the way you could a 150-page book about Animal Crossing.

For a long time, I have wanted to read more detailed writing on the Noble Eightfold Path, so I finally did. I read The Noble Eightfold Path: Way to the End of Suffering by Bhikkhu Bodhi. I’m glad I did, but it’s not easy to recommend generally. First, you need to be interested in Buddhism in general. Then, you need to have read enough about it (I think) that you want to read what is almost a technical manual about some of the core tenets. It’s a bit like reading a catechism, in which serious religious, metaphysical, and practical questions are answered in great and careful detail for the dedicated lay reader. I wish it had been a bit more focused on description and less on instruction. That is: I wanted to read analysis of, and the relationships between, the eight practices, rather than a book intended to convince me of their importance. Still, this got close and I’m glad I read it.

What’s next?

I have no idea! Well, not much of an idea. Goodreads reminds me that I’m currently reading books about Eiffel, Rust, and WebAssembly. I received a few books for Christmas, and of course I already have a huge backlog of owned and unread books. There are just a few Egan novels I haven’t read yet. Lots of books remain on my “effectiveness” shelf. We’ll see where the year takes me.

One thing is seeming increasingly likely, though. I’ve read Gene Wolfe’s Book of the New Sun three (I think) times, now. These books get better as you re-read them and try to work out the many mysteries within them. Last time I read them, I thought, “When I read these again, it will be with a notebook for taking notes.” I think this is the year. I might also finally listen to ReReading Wolfe, an epic-length podcast that goes through the books chapter by chapter, just for people who are re-reading the books, so spoilers a-plenty. I’ve been thinking about trying to find old hardback copies of the books to mark up, but it seems like most of them are quite expensive!

At any rate, maybe in a year I’ll write another blog post like this one. If I do, I hope it will be able to mention at least 36 books I’ve read in 2026.

In an attempt to avoid the /x switch for a complex regular expression, I tried to replace qr/string/ with the following expression:

(map { qr/$_/ } ("a more" . "complex regex"))[0]

As the latter expression uses double-quoted strings, I thought I would have to duplicate any backslash that should go into the qr operator. Because of that, I tried something like this:

(map { qr/$_/ } (
  "^Load key \"\\Q$host_CA\\E\": " 
  . 'incorrect passphrase supplied ' 
  . "to decrypt private key$CRLF"
))

However, Perl 5.26 complains with the following error message:

Unrecognized escape \Q passed through in regex; marked by <-- HERE in m/^Load key "\Q <-- HERE ...

It seems I don't have to duplicate the backslash, but I don't understand why. The following examples illustrate the behavior I am seeing:

DB<3> $x='f*o'

DB<4> x qr/\Q$x\E/
0  (?^u:f\\*o)
   -> qr/(?^u:f\*o)/

DB<5> x map { qr/$_/ } ("\\Q$x\\E/")
0  (?^u:\\Qf*o\\E/)
   -> qr/(?^u:\Qf*o\E\/)/

DB<6> x map { qr/$_/ } ("\Q$x\E/")
0  (?^u:f\\*o/)
   -> qr/(?^u:f\*o\/)/
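
The reason (a hedged note, not part of the original question): \Q...\E is handled by double-quoted string processing itself, so it is applied when the string is built, not later when the finished string is interpolated into qr//. A standalone illustration:

use strict;
use warnings;

my $x = 'f*o';

my $s1 = "\Q$x\E";     # quotemeta applied by the string quoting: 'f\*o'
my $s2 = "\\Q$x\\E";   # literal backslash-Q characters: '\Qf*o\E'

print "$s1\n";   # f\*o
print "$s2\n";   # \Qf*o\E
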
As preparations are underway for the Perl Toolchain Summit 2026, this short episode is the last excerpt recorded during PTS 2025 (as a tangent during the CPAN Testers interview, published as episode 6). BooK starts by explaining the selection process for the Perl Toolchain Summit and some of the history, and then Doug, Ruth, Breno and Ferki reminisce about what makes the event so special.

(dlxxx) 6 great CPAN modules released last week

Niceperl
Updates for great CPAN modules released last week. A module is considered great if its favorites count is greater than or equal to 12.

  1. App::DBBrowser - Browse SQLite/MySQL/PostgreSQL databases and their tables interactively.
    • Version: 2.438 on 2025-12-25, with 18 votes
    • Previous CPAN version: 2.437_05 was 7 days before
    • Author: KUERBIS
  2. Convert::Pheno - A module to interconvert common data models for phenotypic data
    • Version: 0.29 on 2025-12-23, with 15 votes
    • Previous CPAN version: 0.28 was 8 months, 4 days before
    • Author: MRUEDA
  3. Devel::MAT - Perl Memory Analysis Tool
    • Version: 0.54 on 2025-12-26, with 30 votes
    • Previous CPAN version: 0.53 was 1 year, 9 months, 19 days before
    • Author: PEVANS
  4. Finance::Quote - Get stock and mutual fund quotes from various exchanges
    • Version: 1.68 on 2025-12-21, with 145 votes
    • Previous CPAN version: 1.68 was 3 days before
    • Author: BPSCHUCK
  5. HTTP::Tiny - A small, simple, correct HTTP/1.1 client
    • Version: 0.092 on 2025-12-27, with 115 votes
    • Previous CPAN version: 0.091 was 14 days before
    • Author: HAARG
  6. JSON::Schema::Modern - Validate data against a schema using a JSON Schema
    • Version: 0.631 on 2025-12-25, with 16 votes
    • Previous CPAN version: 0.630 was 10 days before
    • Author: ETHER

I am developing a Virtualmin plugin. The problem is getting a link to appear under the "Manage Virtual Server" category in the Virtualmin sidebar whenever the feature is enabled for a virtual server (domain).

Despite following the standard plugin structure, the menu item refuses to appear in the Virtualmin UI, although the module is accessible if I manually type the URL or find it in the Webmin "Tools" section (when not hidden).

Environment

  • OS: Ubuntu 22.04 / 24.04
  • Virtualmin version: Latest
  • Webmin version: Latest

File Structure

/usr/share/webmin/my-plugin-folder/

  • index.cgi
  • module.info
  • virtual_feature.pl
  • ...

Relevant Code

virtual_feature.pl

require 'my-plugin-lib.pl';

sub feature_name {
    return "plugin_name";
}

sub feature_label {
    return "Plugin Name";
}

sub feature_disables {
    return 1;
}

sub feature_check {
    return undef;
}

sub feature_setup {
    my ($d) = @_;
    return undef;
}

sub feature_links {
    my ($d) = @_;
    # This is intended to place the link under "Manage Virtual Server"
    return ({ 'mod' => $module_name,
              'desc' => "Plugin Name",
              'page' => "index.cgi?dom=" . $d->{'id'},
              'cat' => 'server' }); 
}

1;

module.info

desc=Plugin Name Tool
os_support=*-linux
version=1.6
category=server
depends=virtual-server
virtualmin=1
hidden=1

Expected Behavior

After enabling the feature globally in System Settings -> Features and Plugins, a link should appear in the left-hand sidebar under the "Manage Virtual Server" category.

Actual Behavior

The feature shows up in the "Enabled features" list and can be toggled/saved successfully. However, the link never appears in the sidebar. No errors are logged in /var/webmin/miniserv.error.

What I have tried

  1. Restarting Webmin (/etc/webmin/restart).
  2. Hardcoding the module folder name in the 'mod' field of feature_links.
  3. Changing the 'cat' field to 'services' or 'logs'.
  4. Refreshing the Webmin module cache.
  5. Verifying that the feature is indeed marked as enabled in the domain's configuration file in /etc/webmin/virtual-server/domains/.

Is there a specific registration step or a required function in virtual_feature.pl that I am missing for the sidebar injection to work correctly in recent versions of the Virtualmin Authentic Theme?

JSON::Schema::Validate

Perl Maven

At an online event of the Perl Maven group, we tried to understand this module and even to contribute to it. For more details about the contributions, check out the OSDC Perl page.

This example is based on the one in the documentation of JSON::Schema::Validate, tweaked a bit. It will be useful again if we continue working with this module.

examples/json_schema_validate.pl

use strict;
use warnings;

use JSON::Schema::Validate;
use JSON ();
use open qw( :std :encoding(UTF-8) );
my $schema = {
    '$schema' => 'https://json-schema.org/draft/2020-12/schema',
    '$id'     => 'https://example.org/s/root.json',
    type      => 'object',
    required  => [ 'name' ],
    properties => {
        name => { type => 'string', minLength => 5 },
        next => { '$dynamicRef' => '#Node' },
    },
    '$dynamicAnchor' => 'Node',
    additionalProperties => JSON::false,
};
my $js = JSON::Schema::Validate->new( $schema )
    ->compile
    ->content_checks
    ->ignore_unknown_required_vocab
    ->prune_unknown
    ->register_builtin_formats
    ->trace
    ->trace_limit(200) # 0 means unlimited
    ->unique_keys; # enable uniqueKeys

#my $data = {
#    name => 'head',
#    next => {
#        name => 'tail'
#    }
#};
#my $data = {
#    name => 23,
#    next => {
#        name => 'tail'
#    }
#};
#my $data = {
#    name => 'head',
#};
my $data = {
    name => 'head big',
};


my $ok = $js->validate($data)
    or die( $js->error );
print "ok\n";

Updates for great CPAN modules released last week. A module is considered great if its favorites count is greater than or equal to 12.

  1. App::Netdisco - An open source web-based network management tool.
    • Version: 2.097000 on 2025-12-16, with 810 votes
    • Previous CPAN version: 2.096001 was 2 days before
    • Author: OLIVER
  2. CPANSA::DB - the CPAN Security Advisory data as a Perl data structure, mostly for CPAN::Audit
    • Version: 20251221.001 on 2025-12-21, with 25 votes
    • Previous CPAN version: 20251214.001 was 7 days before
    • Author: BRIANDFOY
  3. Dist::Zilla::Plugin::Test::Compile - Common tests to check syntax of your modules, using only core modules
    • Version: 2.059 on 2025-12-16, with 13 votes
    • Previous CPAN version: 2.058 was 7 years, 11 months, 27 days before
    • Author: ETHER
  4. Image::ExifTool - Read and write meta information
    • Version: 13.44 on 2025-12-15, with 44 votes
    • Previous CPAN version: 13.36 was 3 months, 6 days before
    • Author: EXIFTOOL
  5. JSON::Schema::Modern - Validate data against a schema using a JSON Schema
    • Version: 0.630 on 2025-12-14, with 16 votes
    • Previous CPAN version: 0.629 was 2 days before
    • Author: ETHER
  6. List::Gen - provides functions for generating lists
    • Version: 0.979 on 2025-12-21, with 24 votes
    • Previous CPAN version: 0.978
    • Author: SOMMREY
  7. Minilla - CPAN module authoring tool
    • Version: v3.1.29 on 2025-12-17, with 98 votes
    • Previous CPAN version: v3.1.28 was 3 months, 2 days before
    • Author: SYOHEX
  8. Module::CoreList - what modules shipped with versions of perl
    • Version: 5.20251220 on 2025-12-20, with 44 votes
    • Previous CPAN version: 5.20251120 was 1 month before
    • Author: BINGOS
  9. Mouse - Moose minus the antlers
    • Version: v2.6.1 on 2025-12-20, with 63 votes
    • Previous CPAN version: v2.6.0 was 1 month, 20 days before
    • Author: SKAJI
  10. PGXN::API - Maintain and serve a REST API to search PGXN mirrors
    • Version: v0.21.0 on 2025-12-15, with 18 votes
    • Previous CPAN version: v0.20.2 was 1 year, 9 months before
    • Author: DWHEELER
  11. Sidef - The Sidef Programming Language
    • Version: 25.12 on 2025-12-21, with 121 votes
    • Previous CPAN version: 24.11 was 1 year, 22 days before
    • Author: TRIZEN
  12. Text::Markup - Parse text markup into HTML
    • Version: 0.41 on 2025-12-18, with 12 votes
    • Previous CPAN version: 0.40 was 3 days before
    • Author: DWHEELER
  13. Unicode::UTF8 - Encoding and decoding of UTF-8 encoding form
    • Version: 0.63 on 2025-12-20, with 20 votes
    • Previous CPAN version: 0.62 was 8 years, 8 months, 9 days before
    • Author: CHANSEN
  14. Zonemaster::Backend - A system for running Zonemaster tests asynchronously through an RPC-API
    • Version: 12.0.0 on 2025-12-19, with 16 votes
    • Previous CPAN version: 11.5.0 was 5 months, 22 days before
    • Author: ZNMSTR
  15. Zonemaster::Engine::Exception::NormalExit - run Zonemaster tests from the command line
    • Version: 8.000001 on 2025-12-19, with 23 votes
    • Previous CPAN version: 8.000000 was 5 months, 22 days before
    • Author: ZNMSTR
  16. Zonemaster::Engine - A tool to check the quality of a DNS zone
    • Version: 8.001000 on 2025-12-19, with 35 votes
    • Previous CPAN version: 8.000000 was 5 months, 22 days before
    • Author: ZNMSTR

(dcxxi) metacpan weekly report - MCP

Niceperl

This is the weekly favourites list of CPAN distributions. Votes count: 43

Week's winner: MCP (+3)

Build date: 2025/12/21 13:03:54 GMT


Clicked for first time:

  • App::BlurFill - Blurred background fill image processor
  • Complete::Getopt::Long - Complete command-line argument using Getopt::Long specification
  • Data::Turtle - Turtle Movement and State Operations
  • Marlin - 🐟 pretty fast class builder with most Moo/Moose features 🐟
  • Mojo::Collection::XS - Fast XS subclass of Mojo::Collection with XS-based while
  • SimpleFlow - easy, simple workflow manager (and logger); for keeping track of and debugging large and complex shell command workflows

Increasing its reputation:

Toronto Perl Mongers meeting, December 6, 2025 / How SUSE is using Perl

The Perl and Raku Conference YouTube channel