
TLS in Perl REST request

Perl questions on StackOverflow

This curl command works fine (the endpoint must be accessed with TLS 1.3):

curl -v --tlsv1.3 -u "xxxx":"xxx" -X POST "https:xxxxxxx"

I'm trying to get the same result with Perl. I created a REST::Client with a specific user agent this way:

$ua = LWP::UserAgent->new(
    ssl_opts => { SSL_version => 'SSLv23:!TLSv1:!TLSv1_1:!SSLv3:!SSLv2' },
);
$CLIENT_DCIM = REST::Client->new(
    {
        host      => $IP_DCIM,
        timeout   => 90,
        useragent => $ua,
    }
);

When running, I get these messages with a 500 HTTP error:

Can't connect to xxxxxx:443 (SSL connect attempt failed error:1407742E:SSL routines:SSL23_GET_SERVER_HELLO:tlsv1 alert protocol version)

SSL connect attempt failed error:1407742E:SSL routines:SSL23_GET_SERVER_HELLO:tlsv1 alert protocol version at /root/perl5/lib/perl5/LWP/Protocol/http.pm line 50.

I tried multiple values of SSL_version (per the IO::Socket::SSL documentation) with no success.

Where am I going wrong?
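For reference, one thing worth trying is pinning the handshake to TLS 1.3 explicitly. This is only a sketch (it assumes an IO::Socket::SSL / Net::SSLeay built against OpenSSL 1.1.1 or newer), not a verified fix for the setup above:

use strict;
use warnings;
use LWP::UserAgent;
use REST::Client;

my $host = 'https://xxxxxxx';   # redacted host, as in the question

# Ask IO::Socket::SSL for TLS 1.3 only instead of excluding older versions.
my $ua = LWP::UserAgent->new(
    ssl_opts => { SSL_version => 'TLSv1_3' },
);

my $client = REST::Client->new(
    {
        host      => $host,
        timeout   => 90,
        useragent => $ua,
    }
);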


Tony writes:

```
In addition to the typical stream of small changes to review, Dave's second AST rebuild of ExtUtils::ParseXS arrived (#23883), and I spent several hours reviewing it.

In response to #23918 I worked on adding numeric comparison APIs, which are complicated by overloading, NaNs, SVs' dual IV/NV implementation, and of course by overloading. This includes some fixes for the existing sv_numeq() API. You can see the current state of this work in #23966.

[Hours] [Activity]

2025/11/03 Monday
 0.37  #23886 review and approve
 0.22  #23873 review other comments and follow-up
 0.47  #23887 review, research and approve
 1.72  #23890 review, testing
 0.23  #23890 comment
 0.08  #23891 review and approve
 0.18  #23895 review and approve
 0.67  #23896 review and comment

 3.94

2025/11/04 Tuesday
 0.57  coverity scan results, testing, comment on #23871
 1.15  #23885 review and comment
 1.03  #23871 testing per wolfsage's example, work on a regression test and fix, testing, push to PR 23897
 1.67  #21877 debugging, fix my understanding on PerlIO and the code, testing

 4.42

2025/11/05 Wednesday
 0.70  #23897 fix non-taint perl, testing and update PR
 0.58  #23896 recheck
 1.50  #23885 comment
 0.57  #21877 look into remaining test failure, find the cause and workaround it

 3.35

2025/11/06 Thursday
 0.08  #23902 review and approve
 0.08  #23898 review and approve
 0.55  #23899 review and approve
 0.97  #23901 review and approve
 0.95  #23883 review
 1.40  #23883 review up to Node::include

 4.03

2025/11/10 Monday
 1.60  #23795 review updates, comment
 0.35  #23907 review, research and approve
 1.07  #23908 review, research, comment (fixed while I worked)
 0.63  #23883 continue review, comment

 3.65

2025/11/11 Tuesday
 0.57  #23908 review updates and approve
 0.40  #23911 review, review history of associated ticket and approve
 0.85  #23883 more review
 1.37  #23883 more review

 3.19

2025/11/12 Wednesday
 0.73  #23913 review, research and approve
 0.77  #23914 review, check for SvIsUV() usage on CPAN
 0.83  #23910 testing, get some strange results
 0.82  #23910 debugging, can't reproduce in new builds
 0.67  #23883 more review

 3.82

2025/11/13 Thursday
 0.73  #23918 review discussion and research
 0.75  #23917 review and approve
 0.23  #23919 review and approve
 1.03  #23883 more review
 1.27  #23883 more review

 4.01

2025/11/17 Monday
 1.13  testing, comments on new XS API list thread
 0.97  #23923 review and approve
 1.25  #23914 testing, comment, review
 0.43  #23914 more review and approve
 0.93  #23888 review, comments, some side discussion of 23921

 4.71

2025/11/18 Tuesday
 0.50  #23888 review updates, testing, approve
 0.27  #23943 review and approve
 0.52  #23883 more review
 1.27  #23883 more review

 2.56

2025/11/19 Wednesday
 0.78  #23922 review and approve
 1.08  #23918 work on new compare APIs
 0.53  #23918 debugging
 1.22  #23918 testing, cleanup
 0.82  #23918 re-work documentation

 4.43

2025/11/20 Thursday
 2.50  #23918 work on sv_numcmp(), research, test code, testing, debugging
 1.07  #23918 work out an issue, more testing, document sv_numcmp variants

 3.57

2025/11/24 Monday
 0.08  #23819 review and approve
 2.77  #23918 NULL tests and fix, test for NV/IV mishandling and fix
 0.82  #23918 open #23956, start on le lt ge gt implementation
 1.20  #23918 finish implementation, test code, testing

 4.87

2025/11/25 Tuesday
 0.67  #23885 review, comment
 1.13  #23885 more review
 1.03  #23918 some polish

 2.83

2025/11/26 Wednesday
 0.07  #23960 review and approve
 2.07  #23885 review, research and comments
 0.48  #23918 more polish, testing
 1.60  #23918 finish polish, push for CI

 4.22

2025/11/27 Thursday
 0.58  #23918 check CI, add perldelta and push
 0.58  check CI results and make PR 23966
 0.48  comment on dist discussion on list

 1.64

2025/11/28 Friday
 0.18  #23918 fix a minor issue

 0.18

Which I calculate is 59.42 hours.

Approximately 32 tickets were reviewed or worked on.
```


Dave writes:

Last month was relatively quiet.

I worked on a couple of bugs and did some final updates to my branch that rewrites perlxs.pod, which I intend to merge in the next few days.

Summary:

  • 10:33 GH #16197 re eval stack unwinding
  • 4:47 GH #18669 dereferencing result of ternary operator skips autovivification
  • 2:06 make perl -Dx display lexical variable names
  • 10:58 modernise perlxs.pod

Total:

  • 28:24 TOTAL (HH::MM)

Prototype mismatch importing symbols

Perl questions on StackOverflow

I get a "Prototype mismatch" warning in code that either imports 'blessed' from Scalar::Util or defines it depending on the version of Scalar::Util. Is there a way to suppress the warning, or do I need to turn off signatures and add a prototype to my own blessed() sub as shown below? Am I correct that the latter is only a good practice if Scalar::Util::blessed will have a prototype forever? (I don't understand why it has a prototype now.)

use strict; use warnings;
use feature qw( unicode_strings signatures );
no warnings 'experimental::signatures';
use utf8;
use version;

require Scalar::Util;
if ( version->parse( Scalar::Util->VERSION ) >= version->parse( '1.53' ) ) {
    STDERR->print( "Using Scalar::Util::blessed()\n" );
    Scalar::Util->import( 'blessed' );
}
else {
    STDERR->print( "Using our own blessed()\n" );
    no feature 'signatures';
    sub blessed($) {
      # Workaround for Scalar-List-Utils bug #124515 fixed in 1.53 (Perl v5.31.6)
      my $class = Scalar::Util::blessed($_[0]);
      utf8::decode($class) if defined($class) && ! utf8::is_utf8($class);
      return $class;
    }
}

sub new ( $class, %arg ) {
    return bless( { CODE => '', %arg }, $class );
}
RMG - Link reference commit in section Bump the version number

Co-authored-by: Eric Herman <eric@freesa.org>
Remove SBOX case statements from external visibility

I'm pretty sure there is no use case for these, and it is very unlikely that
they have any actual uses.
Remove a few more macros from being visible to XS code

These are a few macros dealing with inversion lists that were never
intended to be visible to general XS code; they can't actually be in use
on CPAN because the mechanisms to create inversion lists are private to
the perl core.
Macros guarded by some #ifdef's aren't globally visible

The previous commit undefines macros that aren't supposed to be visible
to XS code.  But, to avoid any possible breakage, it creates an
exception list of symbols that may have been visible, and leaves them
so.  The goal is to stop the list from growing as new code is developed,
and to shorten the list by various means.

This is the first commit to do that, by looking to see if any symbols
aren't actually externally visible because they are guarded by #ifdef's
that evaluate to false.  For example a symbol that is #defined only if
PERL_CORE is defined won't be visible, and need not be on the exception
list.

This cuts almost 30% off the initial list.

Gain control of macro namespace visibility

Perl commits on GitHub
Gain control of macro namespace visibility

This commit adds the capability to undefine macros that are visible to
XS code but shouldn't be.  This can be used to stop macro namespace
pollution by perl.

It works by changing embed.h to have two modes, controlled by a #ifdef
that is set by perl.h.  perl.h now #includes embed.h twice.  The first
time works as it always has.  The second sets the #ifdef, and causes
embed.h to #undef the macros that shouldn't be visible.  This call is
just before perl.h returns to its includer, so that these macros have
come and gone before the file that #included perl.h is affected by them.
It comes after the inline headers get included, so they have access to
all the symbols that are defined.

The list of macros is determined by the visibility given by the apidoc
lines documenting them, plus several exception lists that allow a symbol
to be visible even though it is not documented as such.

In this commit, the main exception list contains everything that is
currently visible outside the Perl core, so this should not break any
code.  But it means that the visibility control is established for
future changes to our code base.  New macros will not be visible except
when documented as needing to be such.  We can no longer inadvertently
add new names to pollute the user's.

I expect that over time, the exception list will become smaller, as we
go through it and remove the items that really shouldn't be visible.  We
can then see via smoking if someone is actually using them, and either
decide that these should be visible, or work with the module author for
another way to accomplish their needs.  (I would hope this would lead to
proper documentation of the ones that need to be visible.)

There are currently four lists of symbols.

One list is for symbols that are used by libc functions, and that Perl
may redefine (usually so that code doesn't have to know if it is running
on a platform that is lacking the given feature.)  The algorithm added
here catches most of these and keeps them visible, but there are a few
items that currently must be manually listed.

A second list is of symbols that the re extension to Perl requires, but
no one else needs to.  This list is currently empty, as everything
initially is in the main exception list.

A third list is for items that other Perl extensions require, but no one
else needs to.  This list is currently empty, as everything initially is
in the main exception list.

The final list is for items that currently are visible to the whole
world.  It contains thousands of items.  This list should be examined
for:

    1) Names that shouldn't be so visible; and
    2) Names that need to remain visible but should be changed so they
       are less likely to clash with anything the user might come up
       with.

I have wanted this ability for a long time, and now things have come
together to enable it.

This allows us to have a clear-cut boundary with CPAN.

It means you can add internal-only macros without having to worry about
choosing names that are unlikely to clash with user names.

It shows precisely, in one place, which of our names are visible to
CPAN.

The Day Perl Stood Still: Unveiling A Hidden Power Over C

r/perl

Layout strategy for a script with supporting functions

r/perl

I use a script called ls2htm when I want to show a small directory as a halfway-decent webpage. Here's an example.

I borrowed some defaults from Apache autoindex. If the directory holds

optional HEADER.htm (or HEADER.txt)
f1.txt
f2.c
optional README.htm (or README.txt)

then index.htm would hold

Title
Included HEADER
File display:
  icon  filename  modtime  size  description-if-any
  DIR   ..        -        -     Parent directory
  TXT   f1.txt    ...            Some neat text file
  C     f2.c      ...            Equally nifty C program
Included README
Footer with last-modified date, page version, etc.

I have some functions that are useful on their own:

dir2json: File metadata, description, etc. stored as JSON array
dir2yaml: Same things stored as YAML array
json2htm, yaml2htm: Convert arrays to Apache autoindex format

My first thought was to just make a module, but it occurred to me that writing it as a modulino would make it easier for others to install and use.
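For context, the modulino pattern referred to here is roughly the following (a generic sketch, not ls2htm itself; the package and sub names are placeholders):

package App::Ls2Htm;   # placeholder name for illustration
use strict;
use warnings;

sub dir2json { ... }   # the reusable pieces live in ordinary subs
sub json2htm { ... }

sub main {
    my (@argv) = @_;
    # parse options, call dir2json()/json2htm(), write index.htm, ...
    return 0;
}

# The modulino trick: run main() when executed as a script,
# but do nothing extra when loaded with "use" or "require".
exit main(@ARGV) unless caller;

1;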

Suggestions?

submitted by /u/vogelke

A Pod plugin for VSCode

r/perl

I have a regular expression, written for Florian Ingerl's extended Java regex library, to parse a LaTeX command:

\\\\DocumentMetadata(?<docMetadata>\\{(?:[^{}]|(?'docMetadata'))*\\})

but the important thing is that it allows nested braces.

The concrete string to be matched is

\DocumentMetadata{pdfversion=1,7,pdfstandard={a-3b,UA-1}}

but sooner or later deeper nesting will be required.

To that end, I use com.florianingerl.util.regex.MatchResult, part of that extension of the built-in Java regular expressions.

Now I want to use latexmk (which is written in Perl) for LaTeX, and I need to adapt .latexmkrc, which is just Perl code.

So I need the same regular expression, or at least something similar that I can automatically transform.
Up to now every expression has worked in both worlds, but this one does not match in Perl.

Maybe there is some extension which does.

In Perl I found regex recursion, but not with named groups.
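For what it's worth, Perl's regex engine can recurse into a named group with (?&NAME); a minimal sketch (not from the post) against the sample string:

#!/usr/bin/env perl
use strict;
use warnings;

my $str = '\DocumentMetadata{pdfversion=1,7,pdfstandard={a-3b,UA-1}}';

# (?&docMetadata) recurses into the named group, so nested braces match.
if ( $str =~ /\\DocumentMetadata(?<docMetadata>\{(?:[^{}]|(?&docMetadata))*\})/ ) {
    print "matched: $+{docMetadata}\n";
}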

Perl 🐪 Weekly #750 - Perl Advent Calendar 2025

dev.to #perl

Originally published at Perl Weekly 750

Hi there,

One of the community's most enjoyable yearly traditions, the Perl Advent Calendar 2025, kicks off this week. Every day a new article, tutorial, or in-depth analysis is released, demonstrating the ingenuity and skill that continue to propel Perl forward.

The calendar has something for every skill level, whether you're interested in cutting-edge Perl techniques, witty one-liners, CPAN gems, or true engineering tales. It serves as a reminder that Perl's ecosystem is still active, creative, and developing, driven by a fervent community that enjoys exchanging knowledge.

If you still want more, be sure to check out The Weekly Challenge Advent Calendar 2025. There you'll find not just Perl, but Raku as well.

Last but not least, I'd like to extend my heartfelt thanks to Gabor Szabo for kindly promoting my book: Design Patterns in Modern Perl - your support means a great deal. And to the Perl community: thank you for embracing my first book with such warmth and encouragement. Your enthusiasm continues to inspire me.

Enjoy the rest of the newsletter, and stay safe and healthy.

--
Your editor: Mohammad Sajid Anwar.

Articles

PAGI: ASGI For Perl, or the Spiritual Successor to Plack

PAGI (Perl Asynchronous Gateway Interface) is a new specification for async Perl web applications, inspired by Python's ASGI. It supports HTTP, WebSockets, and Server-Sent Events natively, and can wrap existing PSGI applications for backward compatibility.

plenv-where

A plenv plugin to show which Perl versions have a particular module.

LPW 2025 - Event Report

Here is my detailed report of LPW 2025 that includes the slides of my presentation.

Living Perl: Building a CNN Image Classifier with AI::MXNet

This article demonstrates Perl's continued relevance in cutting-edge fields by showcasing integration with MXNet, a major deep learning framework. The ability to build convolutional neural networks (CNNs) in Perl for image classification represents significant technical sophistication.

Perl Advent Calendar

The Ghost of Perl Developer Surveys Past, Present, and Future

The article demonstrates sophisticated understanding of developer tooling ecosystems and community trends. The comparison between 2009-2010 surveys and the 2025 results shows deep insight into how Perl development practices have evolved while maintaining continuity.

All I Want for Christmas Is the Right Aspect Ratio

The step-by-step progression from simple Perl script to full Docker deployment serves as an excellent tutorial on modern Perl module distribution. It shows how a well-designed module can serve diverse audiences from command-line power users to web developers to DevOps teams.

Santa's Secret Music Studio

The step-by-step approach from "need to identify devices" to "controlling a synth" serves as an excellent mini-tutorial. The mention of related modules (MIDI::RtController, MIDI::RtController::Filter::Tonal) provides helpful pointers for readers wanting to explore further.

Stopping the Evil Grinch: A Holiday Defense Guide

This article demonstrates enterprise-grade security automation using Perl as a robust orchestration layer. The solution elegantly combines multiple security tools (Lynis for auditing, ClamAV for malware scanning) with professional email reporting.

Santa needs to know about new toys...

This article successfully teaches professional API integration through storytelling, making technical concepts accessible while demonstrating production-ready Perl code patterns. The holiday theme enhances rather than distracts from the educational content.

ToyCo want to push new toy updates

This article beautifully demonstrates transitioning from a polling-based API client to a webhook-based service - a common and important architectural pattern in modern web development. The scenario of "crippling ToyCo's servers" with excessive polling is both realistic and educational.

Abstract storage of Christmas letters

This solution demonstrates sophisticated software design with the strategic use of Storage::Abstract to create a clean abstraction layer between business logic and data storage. The anticipation of changing storage requirements and preemptive abstraction is professional forward-thinking.

The Weekly Challenge

The Weekly Challenge by Mohammad Sajid Anwar will help you step out of your comfort zone. You can even win prize money of $50 by participating in the weekly challenge. We pick one champion at the end of the month from among all of the contributors during the month, thanks to the sponsor Lance Wicks.

The Weekly Challenge - 351

Welcome to a new week with a couple of fun tasks "Special Average" and "Arithmetic Progression". If you are new to the weekly challenge then why not join us and have fun every week. For more information, please read the FAQ.

RECAP - The Weekly Challenge - 350

Enjoy a quick recap of last week's contributions by Team PWC dealing with the "Good Substrings" and "Shuffle Pairs" tasks in Perl and Raku. You will find plenty of solutions to keep you busy.

TWC350

This implementation demonstrates elegant Perl craftsmanship. The good substrings solution is particularly clever, using a regex lookahead to capture all overlapping 3-character substrings in one pass, then filtering to ensure no repeated characters - a beautifully concise one-liner.

The Good Shuffle

The solutions demonstrate strong understanding of both algorithmic thinking and Raku language features. The shuffle pairs solution is particularly clever in its use of canonical forms and early termination conditions.

Good Substring / Shuffle Pairs

The Perl implementation demonstrates clean, readable code with thoughtful organization. The good substrings solution uses efficient array slicing and clear manual comparison logic that's easily understandable.

Shuffled Strings

This is an exceptionally elegant Perl implementation showcasing expert-level Perl idioms. Both solutions exemplify Perl's philosophy of "making easy things easy and hard things possible" with concise, expressive code that solves the problems elegantly without unnecessary complexity.

only Perl!

This is a comprehensive and impressively diverse implementation across multiple languages and environments. The Raku solutions showcase excellent use of the language's functional features. The PL/Perl implementations are particularly noteworthy for their adaptability to database environments.

Perl Weekly Challenge 350

This solution stands out for its deep mathematical analysis and optimization. The Task 2 solution demonstrates remarkable theoretical insight by using modular arithmetic with modulo 9 to significantly reduce the search space - achieving a 5.2x speedup is an impressive feat of algorithmic optimization.

Shuffling the Good

This solution demonstrates exceptional cross-language programming skills with clean, idiomatic implementations across four different languages (Raku, Perl, Python, Elixir). The consistent algorithmic approach while respecting each language's unique idioms shows deep understanding of multiple programming paradigms.

Good pairs

Both solutions showcase excellent Perl craftsmanship with thoughtful comments, clear variable naming, and robust handling of edge cases. Peter demonstrates both theoretical understanding (mathematical bounds, algorithmic complexity) and practical implementation skills.

The Weekly Challenge #350

This is a masterclass in professional Perl documentation and code structure. The solutions feature comprehensive POD documentation with clear attribution, problem descriptions, notes, and IO specifications - demonstrating exceptional software engineering practices.

A Good Shuffle

This solution demonstrates elegant Perl craftsmanship with a particularly clever approach. Using a regex with a lookahead assertion /(?=(...))/g to capture overlapping substrings is an expert-level Perl idiom that showcases deep understanding of the language's regex capabilities.

Good shuffling

This solution demonstrates excellent cross-language programming skills with clear parallel implementations in both Python and Perl. The Task 1 solution is elegantly simple - the Python version using set(substr) for uniqueness checking and the Perl version using a hash with early returns showcase appropriate idioms for each language while maintaining the same algorithmic approach.

Rakudo

2025.48 Advent is Here

Weekly collections

NICEPERL's lists

Great CPAN modules released last week;
MetaCPAN weekly report.

Events

Paris.pm monthly meeting

December 10, 2025

German Perl/Raku Workshop 2026 in Berlin

March 16-18, 2026

You joined the Perl Weekly to get weekly e-mails about the Perl programming language and related topics.

Want to see more? See the archives of all the issues.

Not yet subscribed to the newsletter? Join us free of charge!

(C) Copyright Gabor Szabo
The articles are copyright the respective authors.

Welcome to Week #351 of The Weekly Challenge.

Advent Calendar 2025

The Weekly Challenge
It’s amazing to look back and see how far we’ve come - and none of it would have been possible without the energy, passion, and teamwork of everyone in Team PWC. Thanks for bringing the magic to life once again!
The gift is presented by Jaldhar H. Vyas. Today he is talking about his solution to The Weekly Challenge - 310. This is reproduced for Advent Calendar 2025 from the original post.
Thank you Team PWC for your continuous support and encouragement.

Weekly Challenge: Good shuffling

dev.to #perl

Weekly Challenge 350

Each week Mohammad S. Anwar sends out The Weekly Challenge, a chance for all of us to come up with solutions to two weekly tasks. My
solutions are written in Python first, and then converted to Perl. It's a great way for us all to practice some coding.

Challenge, My solutions

Task 1: Good Substrings

Task

You are given a string.

Write a script to return the number of good substrings of length three in the given string. A string is good if there are no repeated characters.

My solution

This is relatively straightforward, so it doesn't require much explanation. I have a variable called i that starts at zero and ends at three less than the length of the string. For each iteration, I extract the three letters starting at the specified position. If the three letters are unique, I add one to the count variable.

def good_substr(input_string: str) -> int:
    count = 0

    for i in range(len(input_string) - 2):
        substr = input_string[i:i + 3]

        if len(set(substr)) == 3:
            count += 1

    return count

In Python, I convert the three characters to a set. If the length of the set is 3, I know all characters are unique (sets can only store unique values). In Perl, I use a %chars hash. If a letter has already been seen, the sub returns 0.

sub is_unique ($substr) {
    my %chars;
    foreach my $char (split //, $substr) {
        return 0 if exists $chars{$char};
        $chars{$char} = 1;
    }
    return 1;
}
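
The counting loop that calls is_unique() isn't shown in the post; a sketch of how it might look (my reconstruction, not the author's code):

sub good_substr ($input_string) {
    my $count = 0;
    for my $i (0 .. length($input_string) - 3) {
        # Count the window if its three characters are all different.
        $count++ if is_unique(substr($input_string, $i, 3));
    }
    return $count;
}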

Examples


$ ./ch-1.py abcaefg
5

$ ./ch-1.py xyzzabc
3

$ ./ch-1.py aababc
1

$ ./ch-1.py qwerty
4

$ ./ch-1.py zzzaaa
0

Task 2: Shuffle Pairs

Task

If two integers A <= B have the same digits but in different orders, we say that they belong to the same shuffle pair if and only if there is an integer k such that B = A × k where k is called the witness of the pair.

For example, 1359 and 9513 belong to the same shuffle pair, because 1359 * 7 = 9513.

Interestingly, some integers belong to several different shuffle pairs. For example, 123876 forms one shuffle pair with 371628, and another with 867132, as 123876 × 3 = 371628, and 123876 × 7 = 867132.

Write a function that for a given $from, $to, and $count returns the number of integers $i in the range $from <= $i <= $to that belong to at least $count different shuffle pairs.

My solution

This is an interesting challenge as the solution requires some thinking. Additionally there was an error with the original page, so I raised a pull request to fix it.

I start by setting the shuffle_pairs variable to zero. I have a loop with the variable value that runs from from to to (inclusive). As from is a reserved word in Python, I use the variable names start and stop instead.

For each iteration of value, I do the following.

  1. Set the variable this_count to zero. This will store the number of shuffle pairs for this value.
  2. Set the variable multiplier to 2.
  3. Set a variable called candidate to the product of value and multiplier.
  4. If this is a shuffle_pair, increment the this_count value.
  5. If the candidate is equal to or greater than 10 ** len(str(value)), end the loop. Such a candidate has more digits than the original, so no shuffle pair can be found for it.
  6. Once the loop ends, increment the shuffle_pairs value if this_count >= count.

def shuffle_pairs(start: int, stop: int, count: int) -> int:
    shuffle_pairs = 0

    for value in range(start, stop + 1):
        this_count = 0

        multiplier = 2
        while True:
            candidate = value * multiplier
            if candidate >= 10 ** len(str(value)):
                break
            if is_shuffle_pair(value, candidate):
                this_count += 1

            multiplier += 1

        if this_count >= count:
            shuffle_pairs += 1

    return shuffle_pairs

The Perl solution follows the same logic.
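A minimal sketch of that logic in Perl (my reconstruction, not the author's code, including a digit-sorting is_shuffle_pair helper):

use strict;
use warnings;
use feature 'signatures';
no warnings 'experimental::signatures';

sub is_shuffle_pair ($x, $y) {
    # Same digits in a different order: compare the sorted digit strings.
    return join('', sort split //, $x) eq join('', sort split //, $y);
}

sub shuffle_pairs ($from, $to, $count) {
    my $total = 0;
    for my $value ($from .. $to) {
        my $this_count = 0;
        my $limit      = 10 ** length($value);   # smallest number with one more digit
        for my $multiplier (2 .. 9) {
            my $candidate = $value * $multiplier;
            last if $candidate >= $limit;
            $this_count++ if is_shuffle_pair($value, $candidate);
        }
        $total++ if $this_count >= $count;
    }
    return $total;
}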

Examples

$ ./ch-2.py 1 1000 1
0

$ ./ch-2.py 1500 2500 1
3

$ ./ch-2.py 1000000 1500000 5
2

$ ./ch-2.py 13427000 14100000 2
11

$ ./ch-2.py 1030 1130 1
2

As you know, The Weekly Challenge primarily focuses on Perl and Raku. During Week #018, we received solutions to The Weekly Challenge - 018 by Orestis Zekai in Python. It was a pleasant surprise to receive solutions in something other than Perl and Raku. Ever since, regular team members have also been contributing in other languages like Ada, APL, Awk, BASIC, Bash, Bc, Befunge-93, Bourne Shell, BQN, Brainfuck, C3, C, CESIL, Chef, COBOL, Coconut, C Shell, C++, Clojure, Crystal, CUDA, D, Dart, Dc, Elixir, Elm, Emacs Lisp, Erlang, Excel VBA, F#, Factor, Fennel, Fish, Forth, Fortran, Gembase, Gleam, GNAT, Go, GP, Groovy, Haskell, Haxe, HTML, Hy, Idris, IO, J, Janet, Java, JavaScript, Julia, K, Kap, Korn Shell, Kotlin, Lisp, Logo, Lua, M4, Maxima, Miranda, Modula 3, MMIX, Mumps, Myrddin, Nelua, Nim, Nix, Node.js, Nuweb, Oberon, Octave, OCaml, Odin, Ook, Pascal, PHP, PicoLisp, Python, PostgreSQL, Postscript, PowerShell, Prolog, R, Racket, Rexx, Ring, Roc, Ruby, Rust, Scala, Scheme, Sed, Smalltalk, SQL, Standard ML, SVG, Swift, Tcl, TypeScript, Typst, Uiua, V, Visual BASIC, WebAssembly, Wolfram, XSLT, YaBasic and Zig.

(dlxxvii) 12 great CPAN modules released last week

Niceperl
Updates for great CPAN modules released last week. A module is considered great if its favorites count is greater than or equal to 12.

  1. App::cpm - a fast CPAN module installer
    • Version: 0.998002 on 2025-12-04, with 177 votes
    • Previous CPAN version: 0.998001 was 21 days before
    • Author: SKAJI
  2. App::HTTPThis - Export the current directory over HTTP
    • Version: 0.010 on 2025-12-04, with 24 votes
    • Previous CPAN version: 0.009 was 2 years, 5 months, 12 days before
    • Author: DAVECROSS
  3. App::Netdisco - An open source web-based network management tool.
    • Version: 2.095006 on 2025-11-30, with 800 votes
    • Previous CPAN version: 2.095005 
    • Author: OLIVER
  4. CPANSA::DB - the CPAN Security Advisory data as a Perl data structure, mostly for CPAN::Audit
    • Version: 20251130.001 on 2025-11-30, with 25 votes
    • Previous CPAN version: 20251123.001 was 7 days before
    • Author: BRIANDFOY
  5. JSON::Schema::Modern - Validate data against a schema using a JSON Schema
    • Version: 0.627 on 2025-12-04, with 15 votes
    • Previous CPAN version: 0.626 was 2 days before
    • Author: ETHER
  6. MetaCPAN::Client - A comprehensive, DWIM-featured client to the MetaCPAN API
    • Version: 2.034000 on 2025-12-03, with 27 votes
    • Previous CPAN version: 2.033000 was 1 year, 8 days before
    • Author: MICKEY
  7. Minion::Backend::mysql - MySQL backend
    • Version: 1.007 on 2025-12-01, with 13 votes
    • Previous CPAN version: 1.006 was 1 year, 6 months, 9 days before
    • Author: PREACTION
  8. Object::Pad - a simple syntax for lexical field-based objects
    • Version: 0.822 on 2025-11-30, with 46 votes
    • Previous CPAN version: 0.821 was 4 months, 18 days before
    • Author: PEVANS
  9. Sisimai - Mail Analyzing Interface for bounce mails.
    • Version: v5.5.0 on 2025-12-05, with 81 votes
    • Previous CPAN version: v5.4.1 was 3 months, 5 days before
    • Author: AKXLIX
  10. SPVM - The SPVM Language
    • Version: 0.990108 on 2025-12-03, with 36 votes
    • Previous CPAN version: 0.990107 was 15 days before
    • Author: KIMOTO
  11. Sys::Virt - libvirt Perl API
    • Version: v11.10.0 on 2025-12-01, with 17 votes
    • Previous CPAN version: v11.8.0 was 24 days before
    • Author: DANBERR
  12. Time::Moment - Represents a date and time of day with an offset from UTC
    • Version: 0.46 on 2025-12-04, with 76 votes
    • Previous CPAN version: 0.44 was 7 years, 6 months, 25 days before
    • Author: CHANSEN

This is the weekly favourites list of CPAN distributions. Votes count: 72

Week's winner: JSON::Schema::Validate (+3)

Build date: 2025/12/06 16:48:33 GMT


Clicked for first time:

  • Chess::Plisco - Representation of a chess position with move generator, legality checker etc.
  • Dev::Util - Utilities useful in the development of perl programs
  • Disk::SmartTools - Provide tools to work with disks via S.M.A.R.T.
  • Dump::Krumo - Fancy, colorful, human readable dumps of your data
  • Melian - Perl client to the Melian cache

Increasing its reputation:

Spreadsheet in Perl [closed]

Perl questions on StackOverflow

I am trying to read (and possibly write to) a spreadsheet in Perl. Google told me to try spreadsheet::read; cpan told me it didn't exist, as in

Could not expand [spreadsheet::read]. Check the module name.

I used -x and it included a suggestion to use Spreadsheet::XLSX, since that was the original format of the spreadsheet I was trying to read. Here are the first few lines of code:

use Spreadsheet::XLSX;

my $parser   = Spreadsheet::ParseExcel->new();
my $workbook = $parser->parse('Holiday Gift Fund.XLSx');

if ( !defined $workbook ) {
    die $parser->error();
}

When I run it under the Perl debugger, I get the message

File not found at calc.pl line 7.
 at calc.pl line 7. 
Debugged program terminated.  Use q to quit or R to restart, 

The xlsx file does exist. Any suggestions for reading it?
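For comparison, reading an .xlsx file with Spreadsheet::ParseXLSX (which provides a Spreadsheet::ParseExcel-style interface for .xlsx files) usually looks something like the sketch below. It is not a verified fix for the script above, and it still requires the filename and its case to match the file on disk exactly:

use strict;
use warnings;
use Spreadsheet::ParseXLSX;

my $parser   = Spreadsheet::ParseXLSX->new;
my $workbook = $parser->parse('Holiday Gift Fund.XLSx');
die "could not parse workbook\n" unless defined $workbook;

for my $sheet ($workbook->worksheets) {
    my ($row_min, $row_max) = $sheet->row_range;
    my ($col_min, $col_max) = $sheet->col_range;
    for my $row ($row_min .. $row_max) {
        for my $col ($col_min .. $col_max) {
            my $cell = $sheet->get_cell($row, $col) or next;
            print $cell->value, "\t";
        }
        print "\n";
    }
}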

PERL Rewards & Earnings Guide — December 2025

Perl on Medium

Discover all the ways to unlock rewards and bonuses with Perlevescava.

Seeking community wisdom on why the ApacheBench utility consistently returns a lot of "Non-2xx responses" (502 Bad Gateway) when benchmarking my helloworld web app, which uses Perl's Net::Async::FastCGI behind Nginx as a reverse proxy. The number of concurrent requests is pretty low at 50, but it still returns lots of "Non-2xx responses". Any insight is greatly appreciated. Please see below for the test code and the ApacheBench command used for the benchmark.

It'd be great if someone could try running the same thing and see if you experience the same issue. All you need is Perl 5.14 or above: install the module Net::Async::FastCGI and the Nginx web server, configure Nginx with the little snippet below to hook it up to the helloworld FastCGI script, then run the script in one terminal and start Nginx in another.

  1. ApacheBench command:

    ab -l -v 2 -n 100 -c 50 "http://localhost:9510/helloworld/"

    which returns:

    ...
    Concurrency Level: 50
    Time taken for tests: 0.015 seconds
    Complete requests: 100
    Failed requests: 0
    Non-2xx responses: 85 #NOTE: all of these are "502 Bad Gateway"
    ...
    
  2. Nginx config:

    location /helloworld/ {
      proxy_buffering off ;
      gzip off ;
      fastcgi_pass unix:/testFolder/myPath/myUDS.sock ;
      include fastcgi_params ;
    }
    
  3. helloworld test script:

use strict ;
use warnings ;
use IO::Async::Loop ;
use Net::Async::FastCGI ;
# This script will respond to HTTP requests with a simple "Hello, World!" message.

#If using TCP port for communication:
#my $PORT = 9890 ;

#If using Unix domain socket for communication:
my $uds = 'myUDS.sock' ;

# Create an event loop
my $loop = IO::Async::Loop->new() ;

# Define the FastCGI request handler subroutine
sub on_request {#Parms: request(Net::Async::FastCGI::Request); #Return: void;
  my ( $fcgi, $req ) = @_ ;
  # Prepare the HTTP response
  my $response = "Hello, World!\n" ;
  my $respLen  = length( $response ) ;
  # Print HTTP response headers
  $req->print_stdout(
    "Status: 200 OK" . "\n" .
    "Content-type: text/plain" . "\n" .
    "Content-length: " . $respLen . "\n" .
    "\n" .
    $response
  ) ;
  # Finish the request
  $req->finish() ;
}#end sub

# Create a new FastCGI server instance
my $fcgi = Net::Async::FastCGI->new(
  #handle     => \*STDIN ,       # Read FastCGI requests from STDIN
  on_request => \&on_request ,  # Assign the request handler subroutine
) ;

# Add the FastCGI server instance to the event loop
$loop->add($fcgi) ;

$fcgi->listen(
  #service          => $PORT , #if using TCP portnum
  addr => {# if using Unix domain socket
    family   => "unix" ,
    socktype => "stream" ,
    path     => "$uds" ,
  } ,
  #host             => '127.0.0.1' ,
  on_resolve_error => sub { print "Cannot resolve - $_[-1]\n" } ,
  on_listen_error  => sub { print "Cannot listen - $_[-1]\n" } ,
) ;

$SIG{ HUP } = sub {
  system( "rm -f $uds" ) ;
  exit ;
} ;
$SIG{ TERM } = sub {
  system( "rm -f $uds" ) ;
  exit ;
} ;
$SIG{ INT } = sub {
  system( "rm -f $uds" ) ;
  exit ;
} ;

# Run the event loop
$loop->run() ;

Horror Movie Month 2025

rjbs forgot what he was saying

It’s December, and I should’ve posted this in early November, but I didn’t. I did other stuff. Now, though, I’m on a long plane flight, so I guess it’s time to write a bit of bloggery. (Did I really never write up 2024? Well, maybe later.)

Here’s what we watched for Horror Movie Month this year!

October 1: Heart Eyes (2025)

This is “what if a rom-com was also a slasher?” It was a bit uneven, and didn’t quite nail either part – understandable, but still, you hope for excellence, right? It was novel and fun enough that I’m glad to have watched it. It’s from Josh Ruben, who did some other stuff I also liked.

October 2: The Borderlands (2012)

Hey, this was surprisingly decent. Vatican-sponsored ghost hunters are investigating a haunting. Is it bunk or not? There was nothing particularly special about this movie, except that it was well done. Good cast, good pacing. It’s nice to find a new competent, enjoyable, unexpected movie like this!

October 3: Departing Seniors (2023)

There’s a murderer at a high school, and somebody can see the murders coming in visions. It wasn’t good. It wasn’t terrible, but it was retreading old material, and there were lots of little problems. Why was the school always so empty? If it’s nearly graduation day, why are people wearing big coats?

Watch It’s a Wonderful Knife instead, maybe.

October 5: V/H/S/Halloween

I think this was my favorite V/H/S movie so far. They’re all sort of uneven messes, but this one had the most fun I can remember. I especially enjoyed the weird framing story about the soft drink focus group. Also, Fun Size was fun and weird. More like this one, please, V/H/S people!

October 5: Bring Her Back (2025)

I saw this as “best new horror of 2025” a bunch of places. It was a very well made movie, well written, well-acted. It was a good film. It was also just so grim, for much of it. It was a movie full of desperation. Also, it had “people in authority do bad things to kids”, which I don’t like watching. Still, I’m glad I saw it, and yeah, it was good, but I think we usually are looking for something more fun in Horror Movie Month.

October 6: Presence (2024)

Remember in The Menu, how the chef is making this extremely high-technique food that is hard to criticize for any lack of technical merit, but which sparks no joy? That’s this movie. It’s a well-made haunted house movie, but I didn’t care about what happened to anybody in it.

October 7: Ick (2024)

This movie reminded me of Detention (2011) and that’s a good thing. Brandon Routh plays a guy who was a high school football star and is now sort of washed up. There’s a weird black fungus that grows all over everything, and has been there for decades, and nobody cares. It’s just there. Then, one day, things change. It was weird and fun and unexpected. It wasn’t as good as Detention, but I liked it a lot.

October 8: Somnium (2024)

Big change in gears here from the previous movie. This was a slow-paced, moody movie about a would-be actor in LA who gets a job at some kind of weird dream therapy place. It looks right out of an early 80s Cronenberg movie. She walks around the place at three in the morning, surrounded by sleeping people in wood-paneled rooms. I liked it! I also liked that it was very happy to have a simple moral.

October 9: The Collector (2009)

This was pitched as “Would you like to see a movie that’s kind of like Saw but has been mostly forgotten?” It wasn’t as smart as the original Saw trilogy, and went on way too long. It’s sort of “Home Alone, with adults, and lots of blood.”

October 10: Found Footage: The Making of the Patterson Project (2025)

This little indie movie was a mess, but I had fun. There’s a guy who wants to make an indie horror movie, and is not a skilled filmmaker. His crew have a lot of heart, but are sort of hopeless. They send a guy in a bigfoot costume out into the woods in hunting season. Also, the set might be haunted? I enjoyed it.

October 11: I Know What You Did Last Summer (2025)

This could’ve been a lot worse. If you have any fond memories of the original, go for it. Otherwise, eh.

October 11: [REC]⁴ Apocalypse

Gloria and I had seen the previous REC movies, but I don't think we'd seen this one. It was good! With Satanic zombies back on the mainland, a bunch of people are aboard a ship trying to study the problem. So: people in an enclosed space, at sea, with a zombie outbreak. You can imagine the rest, mostly. It brought back the newscaster from the first two movies, which was fun. (The original REC is definitely worth watching!)

October 13: How to Kill Monsters (2024)

A police station is taken over by monsters and demons and the riff raff inside have to save themselves. There’s a fun framing device where the movie starts at the end of a horrible bloodbath and then you’re sort of figuring out how it all fits together. Like a bunch of other movies on this list, it was enjoyable, but not great.

October 14: Psycho Therapy: The Shallow Tale of a Writer Who Decided to Write About a Serial Killer (2024)

Every year we end up with one or two movies that we thought were going to be sort of horror-y, or maybe just creepy thrillers, but don't work out that way. This was one, this year, and no regrets! Steve Buscemi plays a retired serial killer who approaches a struggling writer to pitch a collaboration on memoirs. But after a little bit of farce, he's roped into pretending to be a marriage counselor for the writer and his wife. Really weird, and who doesn't want to see more Steve Buscemi?

October 17: Vicious (2025)

Polygon suggested this would be a great pick if you liked Weapons. It wasn’t. I guess I was supposed to feel sympathy and horror for the protagonist, but I didn’t care. It was not compelling or scary.

October 18: Dark Match (2024)

Another “flawed but worth it” entry! A group of near-nobodies in the local professional wrestling circuit book a high-paying gig at a private party in the woods. Could it be that the private party is actually a horrible, horrible place to end up? Yes, it could.

Special note: I kept thinking, “Who is this actor?” Turns out he played Bill in GTA V, and I recognized his face from his likeness in a video game. Woah!

October 19: Together

This was another “best of the year” candidate, said the Internet. It wasn’t my favorite, but it was good. It had some serious ideas, and was nicely creepy, but also had a great sense of humor. It’s hard for me to explain much without saying too much. There’s a couple with a long-term relationship. They move to the sticks for a fresh start. Things get weird. Great pacing in this one, I thought, too.

October 20: Here for Blood (2022)

This was definitely one of the best movies we watched! It reminded us of McG's The Babysitter (2017), and in a good way. There's a busy young woman who is taking a bunch of college classes and working maybe a couple of jobs, one of which is babysitting. She needs to cram, so her boyfriend agrees to take one of her babysitting jobs. Meanwhile, somebody has planned to stage a home invasion while she's babysitting. They expect the petite woman, not her massive mixed martial artist boyfriend. It's a trip!

October 21: Minutes Past Midnight (2016)

I don’t remember it well at this point. It was an anthology. I remember that some of it was terrible. On the other hand, there was one lovely and funny bit with Arthur Darvill as a serial killer who falls in love.

October 22: Willy’s Wonderland (2021)

Not good, per se, but I think I’ll still recommend it. It’s a rip off of Five Nights at Freddy’s, where Nic Cage is conscripted to spend a night cleaning an abandoned Chuck E. Cheese style place. The animatronics are out to get him. He fights back. Also, there are dying teenagers.

This movie is all about Nicolas Cage, who has zero lines and punctuates his night of cleaning and demonocide by drinking soda and playing pinball. Unhinged.

October 23: Cherry Falls (2000)

This got on my radar as "the most underappreciated slasher of the post-Scream boom". Yeah, I could buy that. It wasn't great but it was definitely fun and quirky. Also, my household likes Brittany Murphy. The premise: a serial killer is at work in the town of Cherry Falls. Their schtick? They only kill virgins. (The sheriff comes to this conclusion incredibly quickly with a shockingly small amount of evidence. But he is right.)

The town’s teenagers are urged to lose their virginity as quickly as possible.

If this movie had been made 15 years earlier, it would’ve featured a staggering amount of nudity. In 2000, though, it would have none. I wonder just when that changed!

October 24: Scared Shitless (2024)

A plumber enlists the help of his germaphobic son to go clean and fix some toilets. Little did they know that these toilets… are haunted! It was short and stupid and fun, and I am glad we watched it.

October 25: Death of a Unicorn

I don’t know why this wasn’t good. It felt like they overworked the dough. Everything was there for greatness, but it didn’t pan out. Dude hits a unicorn with his car. The unicorn’s parents come back for revenge. At one point, Richard E. Grant eats a piece of unicorn meat. Anthony Carrigan is in it and, as always, is great.

October 26: Grafted (2024)

Kind of a mess. A young woman with a skin disorder is trying to find a new skin graft technique so she can fit in. But she accidentally kills a few people and it gets worse. It wasn’t bad, but it wasn’t great. Some good Kiwi accents in it, though!

October 28: Boys from County Hell (2020)

This one had been sitting in our Shudder queue for ages! It was worth the wait. In a village in Ireland, there’s a cairn that attracts a very modest number of tourists. The claim is that it’s the grave of an Irish vampire whose story inspired Bram Stoker to write Dracula. The level to which anybody believes this is unclear. Meanwhile, the cairn has to be knocked down to make way for a new motorway. Could this free the evil lurking beneath? Yes.

This was another case of “an unexceptional premise made really well”. The pacing, the casting, the sense of humor, all worked well to make a really enjoyable little movie.

October 29: Good Boy (2025)

Was this whole movie a metaphor? I’m not sure.

There’s a guy who is either sick or suicidal or haunted, and he skips town to go stay at his dead uncle’s house, which even the casual observer can tell is a really bad idea. His sister tells him so, but he’s a jerk and won’t listen. He does take his dog, though. His dog is also worried about him, and the movie is entirely from the perspective of the dog – it’s not all in the dog’s point of view, but we’re following the dog, so we have to work things out from what the dog can see.

The dog was great. The movie was only okay.

Ghosted

Oops, this is a later addition! Sometimes we fill in some days with TV. In the past, this was often when we’d watch American Horror Story. This year, we watched about half of Ghosted, a 2017 show starring Adam Scott and Craig Robinson as an odd couple of kinda-losers who get recruited into a secret government agency that pursues X-files. How had I never heard of this before? And, in fact, nearly nobody I’ve talked to has.

Gloria and I liked it, but it was pretty uneven. The structure of the show takes a hard left turn partway through, and it’s worse for sure.

That’s it!

We watched some other things not reflected here. Most notably, we started watching “It: Welcome to Derry”, which was fine.

I think it was a decent year, and I’ll have to see if I can remember enough about 2024’s movies to write those up too…

plenv-where

blogs.perl.org

A plenv plugin to show which Perl versions have a particular module.

I use plenv daily to manage the many Perl configurations which I use for different projects. Sometimes I have to install huge collections of Perl modules for some specific use case. And then I forget which Perl installation under plenv it was where I installed them.

So I wrote this plugin to fix that.

Example use cases:

$ plenv where Dist::Zilla
5.24.4
5.28.2
5.34.1-dzil
5.39.2

It can also report the actual path and/or the module version:

$ plenv where --path --module-version Dist::Zilla
/[..]versions/5.24.4/lib/perl5/site_perl/5.24.4/Dist/Zilla.pm 6.031
/[..]versions/5.28.2/lib/perl5/site_perl/5.28.2/Dist/Zilla.pm 6.032
/[..]versions/5.34.1-dzil/lib/perl5/site_perl/5.34.1/Dist/Zilla.pm 6.033
/[..]versions/5.39.2/lib/perl5/site_perl/5.39.2/Dist/Zilla.pm 6.030

Configuration

This plugin also uses a configuration file. plenv where reads its configuration from ${XDG_CONFIG_HOME}/plenv/where or, if XDG_CONFIG_HOME is not set, from ${HOME}/.config/plenv/where. In the config file, we place every option on its own line.

Installation

The installation is manual.

mkdir -p "$(plenv root)/plugins"
git clone https://github.com/mikkoi/plenv-where.git "$(plenv root)/plugins/plenv-where"

Unlock PERL Benefits & Rewards — December 2025

Perl on Medium

Discover how to access and benefit from the latest PERL reward program.

PWC 350 Good Substring / Shuffle Pairs

dev.to #perl

Musical Interlude

The movie version of Wicked is in theaters right now, so I am reminded of the song For Good -- but I'm gonna link to the Broadway version, because I'm classy like that. It's relevant to programming in Perl because "I don't know if I've been changed for the better, but I have been changed for good." For part two, Lido Shuffle by Boz Scaggs.

Task 1: Good Substrings

The Task

You are given a string. Write a script to return the number of good substrings of length three in the given string. A string is good if there are no repeated characters.

  • Example 1: Input $str = "abcaefg", Output: 5
    • Good substrings of length 3: abc, bca, cae, aef and efg
  • Example 2: Input: $str = "xyzzabc", Output: 3
  • Example 3: Input: $str = "aababc", Output: 1
  • Example 4: Input: $str = "qwerty", Output: 4
  • Example 5: Input: $str = "zzzaaa", Output: 0

The Think-y Part

There's probably a regular expression for this, but I'm not going to find it. Do the simplest thing that works: take three characters at a time and see if they're different.

The Code-y Part

sub goodSubstring($str)
{
    my $good = 0;
    my @s = split(//, $str);
    for ( 0 .. $#s - 2 )
    {
        my ($first, $second, $third) = @s[$_, $_+1, $_+2];
        $good++ if ( $first ne $second && $first ne $third && $second ne $third );
    }
    return $good;
}

Notes:

  • Start by turning the string into a list of characters. It could be done with substr, but that would be untidy.
  • @s[$_, $_+1, $_+2] -- With a nod to readability, I'll extract three consecutive characters with an array slice. It occurs to me that I'll always have two of the next three characters in hand at the bottom of the loop, so doing a complete slice every time could probably be optimized, but I declare it "good" enough.
  • Since there's exactly three characters in play, check for uniqueness in the most obvious way.
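
As an aside, the lookahead idiom mentioned elsewhere in this issue would also work here; a sketch (not the author's code), assuming the same interface as goodSubstring above:

sub goodSubstringRegex
{
    my ($str) = @_;
    # Capture every overlapping window of three characters in one pass.
    my @windows = $str =~ /(?=(...))/g;
    # Keep only the windows with no repeated character.
    return scalar grep { !/(.).*\1/ } @windows;
}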

Task 2: Shuffle Pairs

The Task

If two integers A <= B have the same digits but in different orders, we say that they belong to the same shuffle pair if and only if there is an integer k such that B = A × k. k is called the witness of the pair. For example, 1359 and 9513 belong to the same shuffle pair, because 1359 * 7 = 9513.

Interestingly, some integers belong to several different shuffle pairs. For example, 123876 forms one shuffle pair with 371628, and another with 867132, as 123876 * 3 = 371628, and 123876 * 7 = 867132.

Write a function that for a given $from, $to, and $count returns the number of integers $i in the range $from <= $i <= $to that belong to at least $count different shuffle pairs.

  • Example 1:
    • Input: $from = 1, $to = 1000, $count = 1
    • Output: 0
    • There are no shuffle pairs with elements less than 1000.
  • Example 2:

    • Input: $from = 1500, $to = 2500, $count = 1
    • Output: 3
    • There are 3 integers between 1500 and 2500 that belong to shuffle pairs.
      • 1782, the other element is 7128 (witness 4)
      • 2178, the other element is 8712 (witness 4)
      • 2475, the other element is 7425 (witness 3)
  • Example 3:

    • Input: $from = 1_000_000, $to = 1_500_000, $count = 5
    • Output: 2
    • There are 2 integers in the given range that belong to 5 different shuffle pairs.
      • 1428570 pairs with 2857140, 4285710, 5714280, 7142850, and 8571420
      • 1429857 pairs with 2859714, 4289571, 5719428, 7149285, and 8579142
  • Example 4:

    • Input: $from = 13_427_000, $to = 14_100_000, $count = 2
    • Output: 11
    • 6 integers in the given range belong to 3 different shuffle pairs,
    • 5 integers belong to 2 different ones.
  • Example 5:

    • Input: $from = 1030, $to = 1130, $count = 1
    • Output: 2
    • There are 2 integers between 1030 and 1130 that belong to at least one shuffle pair:
      • 1035, the other element is 3105 (witness k = 3)
      • 1089, the other element is 9801 (witness k = 9)

Deliberations

It takes a minute to digest this one.

I first wondered if there's some algebraic number theory trick that would cut the search space way down, but that made my head hurt, so I moved to doing what computers do best: grinding through a lot of possibilities.

A bad first thought was to try all combinations of the digits, but that's going to die an excruciatingly slow death on the crucifix of combinatorics, not to mention that we'd be completely wasting our time on all but a few combinations.

A better thought is to look only at the multiples of a given number. There are at most 8 multiples of a number in play: the 10th would add a digit, and therefore can't possibly be a reordering. Examples 3 and 4 have a lot of numbers to grind through, but how long can it take, really? It's one banana, Michael; how much could it cost, ten dollars?

How will I decide that a number is a re-ordering? I think I'll reduce each number to a canonical form where the digits are sorted, then use string compare to see if a multiple has the same canonical form.

To the bat-editor, Robin!

First, a little function to turn a number into a canonical form with its digits in sorted order. Turn the number into a list of digits, sort, and then join the digits back into a string.

sub canonical($n)
{
    join("", sort split(//, $n));
}

Now the main course. I'll want to examine every number in the range $from to $to, inclusive. For each number, I'll want to examine its multiples to see if they have the same digits. I need to count the ones that work so that I can check that there are at least $count of them.

sub shufflePair($from, $to, $count)
{
    my $answer = 0;

    for my $n ( $from .. $to )
    {
        my $base = canonical($n);
        my $max = (9 x length($n))+0;
        my $pair = 0;
        for ( 2 .. 9 )
        {
            my $multiple = $n * $_;

            next if $multiple > $max
                 || index($base, substr($multiple,  0, 1)) < 0
                 || index($base, substr($multiple, -1, 1)) < 0;

            if ( canonical($multiple) eq $base )
            {
                $pair++;
            }
        }
        $answer++ if $pair >= $count;
    }
    return $answer;
}

Notes:

  • my $base = canonical($n) -- hang on to this for comparison.
  • my $max = (9 x length($n))+0; -- An optimization. The maximum number we need to be concerned with is one that has the same number of digits, but is all 9s. For example, if $n is 480, then we are dealing with 3-digit numbers, so the largest possible is 999. That's less than 480*3=1440, so we don't have to examine any of the multiples beyond 480*2.
  • for ( 2..9 ) -- These are the only multiples of $n that could possibly have the same number of digits.
  • next if ... -- Besides the check on $max, we can make cheap checks on a single digit. If the first or last digit isn't one of the possible digits, we can avoid the overhead of canonical(), which isn't horrendous, but it does involve allocating lists and a sort.
  • canonical($multiple) eq $base -- This string compare is where we decide if we have a shuffle pair.
  • $answer++ if $pair >= $count -- We increment the answer if this number has at least $count shuffle pairs.

This solution takes a few seconds to run the examples. My optimizations to bail early in many cases cut the run time approximately in half (from about 8 seconds to about 4.5).

PAGI: ASGI For Perl, or the Spiritual Successor to Plack

dev.to #perl

Introducing PAGI: Async Web Development for Perl

TL;DR: PAGI (Perl Asynchronous Gateway Interface) is a new specification for async Perl web applications, inspired by Python's ASGI. It supports HTTP, WebSockets, and Server-Sent Events natively, and can wrap existing PSGI applications for backward compatibility.

The Problem

Modern web applications need more than traditional request-response cycles. Real-time features like live notifications, collaborative editing, and streaming data require persistent connections. This means:

  • WebSockets for bidirectional communication
  • Server-Sent Events for efficient server push
  • Streaming responses for large payloads
  • Connection lifecycle management for resource pooling

PSGI, Perl's venerable web server interface, assumes a synchronous world. While frameworks like Mojolicious have built async capabilities on top, there's no shared standard that allows different async frameworks and servers to interoperate.

PAGI aims to fill that gap.

What is PAGI?

PAGI defines a standard interface between async-capable Perl web servers and applications. If you're familiar with Python's ecosystem, think of it as Perl's answer to ASGI.

A PAGI application is an async coderef with three parameters:

use Future::AsyncAwait;
use experimental 'signatures';

async sub app ($scope, $receive, $send) {
    # $scope   - connection metadata (type, headers, path, etc.)
    # $receive - async coderef to get events from the client
    # $send    - async coderef to send events to the client
}

The $scope->{type} tells you what kind of connection you're handling:

Type       Description
http       Standard HTTP request/response
websocket  Persistent WebSocket connection
sse        Server-Sent Events stream
lifespan   Application startup/shutdown lifecycle

A Simple HTTP Example

Here's "Hello World" in raw PAGI:

use Future::AsyncAwait;
use experimental 'signatures';

async sub app ($scope, $receive, $send) {
    die "Unsupported: $scope->{type}" if $scope->{type} ne 'http';

    await $send->({
        type    => 'http.response.start',
        status  => 200,
        headers => [['content-type', 'text/plain']],
    });

    await $send->({
        type => 'http.response.body',
        body => 'Hello from PAGI!',
        more => 0,
    });
}

Run it:

pagi-server --app app.pl --port 5000
curl http://localhost:5000/
# => Hello from PAGI!

The response is split into http.response.start (headers) and http.response.body (content). This separation enables streaming—send multiple body chunks with more => 1 before the final more => 0.
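
For instance, a streamed response might look roughly like this (a minimal sketch built from the event types above; the chunking itself is illustrative, not taken from the PAGI distribution):

use Future::AsyncAwait;
use experimental 'signatures';

async sub app ($scope, $receive, $send) {
    die "Unsupported: $scope->{type}" if $scope->{type} ne 'http';

    await $send->({
        type    => 'http.response.start',
        status  => 200,
        headers => [['content-type', 'text/plain']],
    });

    # Intermediate chunks: more => 1 tells the server the body isn't finished yet
    for my $chunk (1 .. 3) {
        await $send->({
            type => 'http.response.body',
            body => "chunk $chunk\n",
            more => 1,
        });
    }

    # Final chunk: more => 0 closes the response
    await $send->({
        type => 'http.response.body',
        body => "done\n",
        more => 0,
    });
}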

WebSocket Support

WebSockets are first-class citizens in PAGI:

async sub app ($scope, $receive, $send) {
    if ($scope->{type} eq 'websocket') {
        await $send->({ type => 'websocket.accept' });

        while (1) {
            my $event = await $receive->();

            if ($event->{type} eq 'websocket.receive') {
                my $msg = $event->{text} // $event->{bytes};
                await $send->({
                    type => 'websocket.send',
                    text => "Echo: $msg",
                });
            }
            elsif ($event->{type} eq 'websocket.disconnect') {
                last;
            }
        }
    }
    else {
        die "Unsupported: $scope->{type}";
    }
}

The event loop pattern is consistent across all connection types: await events from $receive, send responses via $send.

PSGI Compatibility

One of PAGI's key features is backward compatibility with PSGI. The PAGI::App::WrapPSGI adapter lets you run existing PSGI applications on a PAGI server:

use PAGI::App::WrapPSGI;

# Your existing Catalyst/Dancer/Plack app
my $psgi_app = MyApp->psgi_app;

my $wrapper = PAGI::App::WrapPSGI->new(psgi_app => $psgi_app);
$wrapper->to_app;

The wrapper handles all the translation: building %env from PAGI scope, collecting request bodies, and converting responses back to PAGI events.

This means you can:

  • Run legacy applications on a PAGI server
  • Add WebSocket endpoints alongside existing routes
  • Migrate incrementally from PSGI to PAGI
  • Share connection pools between old and new code

PAGI::Simple Micro-Framework

For rapid development, PAGI ships with a micro-framework inspired by Express.js:

use PAGI::Simple;

my $app = PAGI::Simple->new(name => 'My API');

$app->get('/' => sub ($c) {
    $c->text('Hello, World!');
});

$app->get('/users/:id' => sub ($c) {
    my $id = $c->path_params->{id};
    $c->json({ user_id => $id });
});

$app->post('/api/data' => sub ($c) {
    my $data = $c->json_body;
    $c->json({ received => $data, status => 'ok' });
});

$app->to_app;

WebSockets are equally clean:

$app->websocket('/chat' => sub ($ws) {
    $ws->on(message => sub ($data) {
        $ws->broadcast("Someone said: $data");
    });
});

PAGI::Simple includes:

  • Express-style routing with path parameters
  • JSON request/response helpers
  • Session management
  • Middleware support (CORS, logging, rate limiting, etc.)
  • Static file serving
  • WebSocket rooms and broadcasting
  • SSE channels with pub/sub

Current Status

PAGI is currently in early beta. The test suite passes, the examples work, but it hasn't been battle-tested in production.

What exists today:

  • Complete PAGI specification
  • Reference server implementation (PAGI::Server)
  • PAGI::Simple micro-framework
  • 13 example applications
  • PSGI compatibility layer
  • 483 passing tests

What it needs:

  • Developers willing to experiment and provide feedback
  • Real-world testing
  • Framework authors interested in building on PAGI
  • Performance profiling and optimization

Getting Started

git clone https://github.com/jjn1056/pagi.git
cd pagi
cpanm --installdeps .
prove -l t/

# Try the examples
pagi-server --app examples/01-hello-http/app.pl --port 5000
pagi-server --app examples/simple-01-hello/app.pl --port 5000

Why This Matters

Perl has excellent async primitives (IO::Async, Future::AsyncAwait), but no shared specification for async web applications. Each framework implements its own approach, which limits interoperability.

PAGI provides that shared foundation. By standardizing on a common interface:

  • Servers can focus on performance and protocol handling
  • Frameworks can focus on developer experience
  • Middleware becomes portable across implementations
  • The ecosystem can grow together rather than in isolation

If you're interested in the future of async Perl web development, I'd love your feedback. Check out the repository, try the examples, and let me know what you think.

Repository: github.com/jjn1056/pagi

PAGI is not yet on CPAN. It's experimental software—please don't use it in production unless you really know what you're doing.

A language awakens the moment its community shares what it has lived and built.

LPW 2025 - Event Report

blogs.perl.org


I attended the London Perl & Raku Workshop 2025 last Saturday. Please find the detailed event report: https://theweeklychallenge.org/blog/lpw-2025

Perl 🐪 Weekly #749 - Design Patterns in Modern Perl

dev.to #perl

Originally published at Perl Weekly 749

Hi there!

The big announcement is that Mohammad Sajid Anwar who runs The Weekly Challenge and who is the other editor of the Perl Weekly newsletter, has published his first book called Design Patterns in Modern Perl. You can buy it both on Amazon and on Leanpub. Leanpub gives you the option to change the price so you can also use this opportunity to give a one-time donation to him. As far as I know, Leanpub also gives a much bigger part of the price to the author than Amazon does. You can also ask them to send the book to your Kindle or you can upload it yourself. I already bought it and started to read it. Now you go, buy the book!

In just a few hours we are going to have the online meeting Perl Code-reading and testing. You can still register here.

Perl on WhatsApp: I am part of a lot of WhatsApp groups about Python and Rust and other non-tech stuff. I figured I could create one for Perl as well. If you are interested join here. There are also two groups on Telegram. One is called Perl 5 that has 141 members and the other one is called Perl Maven community that I created, because I did not know about the other one. The latter has 59 members. You are invited to join any or all of these channels.

I started a poll in the Perl Community Facebook group. There are already 63 votes. It would be nice if you answered too.

Enjoy your week!

--
Your editor: Gabor Szabo.

Announcements

Design Patterns in Modern Perl

Manwar, congratulations! Everyone else, go buy the book! (comments)

Articles

ANNOUNCE: Various updated wikis, including Perl.Wiki

Dotcom Survivor Syndrome – How Perl’s Early Success Created the Seeds of Its Downfall

I like the sentiment, but as one of the commenters pointed out, there was PHP as well.

GitHub and the Perl License

In a nutshell, if you'd like to use 'the Perl license' you probably should include two separate license files. (comments)

Showcase: Localised JSON Schema validation in Perl + JavaScript (CodePen demo included!)

A small project that might interest anyone dealing with form validation, localisation, and JSON Schema in their Perl web applications / REST APIs.

Web

Catalyst::Request body issues with the file position pointer

For those using the Perl Catalyst web framework in ways involving structured request bodies (e.g. API POSTs)...

The Weekly Challenge

The Weekly Challenge by Mohammad Sajid Anwar will help you step out of your comfort-zone. You can even win prize money of $50 by participating in the weekly challenge. We pick one champion at the end of the month from among all of the contributors during the month, thanks to the sponsor Lance Wicks.

The Weekly Challenge - 350

Welcome to a new week with a couple of fun tasks "Good Substrings" and "Shuffle Pairs". If you are new to the weekly challenge then why not join us and have fun every week. For more information, please read the FAQ.

RECAP - The Weekly Challenge - 349

Enjoy a quick recap of last week's contributions by Team PWC dealing with the "Power String" and "Meeting Point" tasks in Perl and Raku. You will find plenty of solutions to keep you busy.

TWC349

Both solutions use a straightforward, single-pass approach that is perfectly suited for the problem. They process the input string only once, making them very efficient (O(n)) and demonstrating a solid grasp of fundamental algorithmic thinking.

Power Pointing

The post is an excellent, practical demonstration of Raku's expressiveness and built-in functionality. It successfully showcases how Raku allows a programmer to transition from a straightforward, imperative approach to a concise, idiomatic, and highly readable functional solution.

More complex than it has to be

The post presents a fascinating and honest case study of over-engineering. Bob deliberately explores a complex, "enterprise-grade" solution to a simple problem, contrasting it with the obvious simple solution.

Meeting Strings

This is an exceptionally well-crafted post. It demonstrates a deep understanding of Raku's idioms and standard library, transforming simple problems into masterclasses in concise, expressive, and functional programming.

moving and grepping

This is an exemplary post that demonstrates exceptional technical breadth, deep practical knowledge, and a clear, effective pedagogical style. It transcends being a mere solution set and serves as a masterclass in polyglot programming and database extensibility.

Perl Weekly Challenge 349

The post demonstrates both deep Perl knowledge and strong pedagogical skills, making complex solutions accessible while showcasing advanced language features.

Power Meets Points

This post demonstrates expert-level Perl programming with deep language knowledge and thoughtful engineering considerations. Matthias combines elegant solutions with practical performance analysis.

The Power of String

This is a high-quality technical post that successfully demonstrates how to solve the same problems in multiple programming languages while maintaining algorithmic consistency.

Powering to the origin

This post demonstrates creative problem-solving with elegant regex decrementing for Task 1 and a clever eval-based dispatch system for Task 2. Peter shows strong analytical thinking by carefully distinguishing between final-position and intermediate-position checks, and makes practical engineering trade-offs between cleverness and performance.

The Weekly Challenge #349

This is a well-structured, professional-grade solution with excellent documentation and robust code organization. Robbie demonstrates strong analytical thinking by carefully addressing potential ambiguities in the problem statement and explicitly warning against common algorithmic pitfalls.

Power Meeting

Roger demonstrates strong analytical skills by questioning the problem statement itself and providing robust solutions for different interpretations, showing both practical implementation skills and deeper algorithmic thinking.

Power Meeting

This is a clean, practical, and well-explained approach to the weekly challenges. Simon demonstrates strong fundamentals with a clear, step-by-step problem-solving methodology.

Weekly collections

NICEPERL's lists

Great CPAN modules released last week.

Events

Perl Maven online: Code-reading and testing

December 1, 2025

Toronto.pm - online - How SUSE is using Perl

December 6, 2025

Paris.pm monthly meeting

December 10, 2025

German Perl/Raku Workshop 2026 in Berlin

March 16-18, 2026

You joined the Perl Weekly to get weekly e-mails about the Perl programming language and related topics.

Want to see more? See the archives of all the issues.

Not yet subscribed to the newsletter? Join us free of charge!

(C) Copyright Gabor Szabo
The articles are copyright the respective authors.

Get them from My Wiki Haven and Symboliciq.au. Details:


  • Perl.Wiki V 1.35

  • Mojolicious Wiki V 1.10

  • Debian Wiki V 1.11

  • (New) PHP Wiki V 1.01

  • Symbolic Language Wiki V 1.18 (at symboliciq.au)

If you were building web applications during the first dot-com boom, chances are you wrote Perl. And if you’re now a CTO, tech lead, or senior architect, you may instinctively steer teams away from it—even if you can’t quite explain why.

This reflexive aversion isn’t just a preference. It’s what I call Dotcom Survivor Syndrome: a long-standing bias formed by the messy, experimental, high-pressure environment of the early web, where Perl was both a lifeline and a liability.

Perl wasn’t the problem. The conditions under which we used it were. And unfortunately, those conditions, combined with a separate, prolonged misstep over versioning, continue to distort Perl’s reputation to this day.


The Glory Days: Perl at the Heart of the Early Web

In the mid- to late-1990s, Perl was the web’s duct tape.

  • It powered CGI scripts on Apache servers.

  • It automated deployments before DevOps had a name.

  • It parsed logs, scraped data, processed form input, and glued together whatever needed gluing.

Perl 5, released in 1994, introduced real structure: references, modules, and the birth of CPAN, which became one of the most effective software ecosystems in the world.

Perl wasn’t just part of the early web—it was instrumental in creating it.


The Dotcom Boom: Shipping Fast and Breaking Everything

To understand the long shadow Perl casts, you have to understand the speed and pressure of the dot-com boom.

We weren’t just building websites.
We were inventing how to build websites.

Best practices? Mostly unwritten.
Frameworks? Few existed.
Code reviews? Uncommon.
Continuous integration? Still a dream.

The pace was frantic. You built something overnight, demoed it in the morning, and deployed it that afternoon. And Perl let you do that.

But that same flexibility—its greatest strength—became its greatest weakness in that environment. With deadlines looming and scalability an afterthought, we ended up with:

  • Thousands of lines of unstructured CGI scripts

  • Minimal documentation

  • Global variables everywhere

  • Inline HTML mixed with business logic

  • Security holes you could drive a truck through

When the crash came, these codebases didn’t age gracefully. The people who inherited them, often the same people who now run engineering orgs, remember Perl not as a powerful tool, but as the source of late-night chaos and technical debt.


Dotcom Survivor Syndrome: Bias with a Backstory

Many senior engineers today carry these memories with them. They associate Perl with:

  • Fragile legacy systems

  • Inconsistent, “write-only” code

  • The bad old days of early web development

And that’s understandable. But it also creates a bias—often unconscious—that prevents Perl from getting a fair hearing in modern development discussions.


Version Number Paralysis: The Perl 6 Effect

If Dotcom Boom Survivor Syndrome created the emotional case against Perl, then Perl 6 created the optical one.

In 2000, Perl 6 was announced as a ground-up redesign of the language. It promised modern syntax, new paradigms, and a bright future. But it didn’t ship—not for a very long time.

In the meantime:

  • Perl 5 continued to evolve quietly, but with the implied expectation that it would eventually be replaced.

  • Years turned into decades, and confusion set in. Was Perl 5 deprecated? Was Perl 6 compatible? What was the future of Perl?

To outsiders—and even many Perl users—it looked like the language was stalled. Perl 5 releases were labelled 5.8, 5.10, 5.12… but never 6. Perl 6 finally emerged in 2015, but as an entirely different language, not a successor.

Eventually, the community admitted what everyone already knew: Perl 6 wasn’t Perl. In 2019, it was renamed Raku.

But the damage was done. For nearly two decades, the version number “6” hung over Perl 5 like a storm cloud – a constant reminder that its future was uncertain, even when that wasn’t true.

This is what I call Version Number Paralysis:

  • A stalled major version that made the language look obsolete.

  • A missed opportunity to signal continued relevance and evolution.

  • A marketing failure that deepened the sense that Perl was a thing of the past.

Even today, many developers believe Perl is “stuck at version 5,” unaware that modern Perl is actively maintained, well-supported, and quite capable.

While Dotcom Survivor Syndrome left many people with an aversion to Perl, Version Number Paralysis gave them an excuse not to look closely at Perl to see if it had changed.


What They Missed While Looking Away

While the world was confused or looking elsewhere, Perl 5 gained:

  • Modern object systems (Moo, Moose)

  • A mature testing culture (Test::More, Test2)

  • Widespread use of best practices (Perl::Critic, perltidy, etc.)

  • Core team stability and annual releases

  • Huge CPAN growth and refinements

But those who weren’t paying attention, especially those still carrying dotcom-era baggage, never saw it. They still think Perl looks like it did in 2002.


Can We Move On?

Dotcom Survivor Syndrome is real. So is Version Number Paralysis. Together, they’ve unfairly buried a language that remains fast, expressive, and battle-tested.

We can’t change the past. But we can:

  • Acknowledge the emotional and historical baggage

  • Celebrate the role Perl played in inventing the modern web

  • Educate developers about what Perl really is today

  • Push back against the assumption that old == obsolete


Conclusion

Perl’s early success was its own undoing. It became the default tool for the first web boom, and in doing so, it took the brunt of that era’s chaos. Then, just as it began to mature, its versioning story confused the industry into thinking it had stalled.

But the truth is that modern Perl is thriving quietly in the margins – maintained by a loyal community, used in production, and capable of great things.

The only thing holding it back is a generation of developers still haunted by memories of CGI scripts, and a version number that suggested a future that never came.

Maybe it’s time we looked again.

The post Dotcom Survivor Syndrome – How Perl’s Early Success Created the Seeds of Its Downfall first appeared on Perl Hacks.

I was writing data-intensive code in Perl, relying heavily on PDL for some statistical calculations (estimation of percentile points in some very BIG vectors, e.g. 100k to 1B elements), when I noticed that PDL was taking a very (and unusually!) long time to produce results compared to my experience in Python. This happened irrespective of whether one used the pct or oddpct functions in PDL::Ufunc.

The performance degradation had a very interesting quantitative aspect: if one asked PDL to return a single percentile, it did so very fast; but if one asked for more than one percentile, the time-to-solution increased linearly with the number of percentiles specified. Looking at the source code of the pct function, it seems that it is implemented by calling the function pctover, which according to the PDL documentation “Broadcasts over its inputs”.

But what exactly is broadcasting? According to PDL::Broadcasting: “[broadcasting] can produce very compact and very fast PDL code by avoiding multiple nested for loops that C and BASIC users may be familiar with. The trouble is that it can take some getting used to, and new users may not appreciate the benefits of broadcasting.” Reading the relevant PDL examples and revisiting the NumPy documentation (which also uses this technique): broadcasting describes how arrays with different shapes are treated during arithmetic operations. Subject to certain constraints, the smaller array is “broadcast” across the larger array so that they have compatible shapes. Broadcasting provides a means of vectorizing array operations so that looping occurs in C instead of Python.

It seems that when one does something like:

use PDL::Lite;
my $very_big_ndarray = ... ; # code that constructs a HUGE PDL ndarray
my $pct              = sequence(100)/100;   # percentiles 0.00 to 0.99
my $pct_values       = pct( $very_big_ndarray, $pct);

broadcasting effectively executes the single-percentile code sequentially, once per requested percentile, and concatenates the results.

The problem with broadcasting for this operation is that the percentile calculation includes a VERY expensive operation, namely the sorting of $very_big_ndarray before the (trivial) calculation of the percentile from the sorted values, as detailed in Wikipedia. So when the percentile operation is broadcast by PDL, the sorting is repeated for each percentile value in $pct, leading to a catastrophic loss of performance!

How can we fix this? It turns out to be reasonably trivial: we need to reimplement the percentile function so that it does not broadcast. One of the simplest quantile functions to implement is the one based on the empirical cumulative distribution function (this corresponds to the Type 3 quantile in the classification by Hyndman and Fan). It can be implemented trivially in Perl using PDL as:

sub quantile_type_3 {
    my ( $data, $pct ) = @_;
    my $sorted_data = $data->qsort;              # sort once, up front
    my $nelem       = $data->nelem;
    my $cum_ranks   = floor( $pct * $nelem );    # nearest-rank index for each requested percentile
    $sorted_data->index($cum_ranks);             # gather all percentile values in one go
}

(The other quantiles can be implemented equally trivially using affine operations as explained in R’s documentation of the quantile function).
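
As a quick usage sketch (assuming the full PDL stack is loaded so that qsort, floor and index are available; the data below is synthetic and purely illustrative):

use PDL;

my $data = random(1_000_000);             # synthetic uniform data, for illustration only
my $pct  = sequence(100) / 100;           # 0.00, 0.01, ..., 0.99
my $q    = quantile_type_3($data, $pct);  # one qsort, then all 100 percentiles at once
print $q->slice('0:4'), "\n";             # peek at the first few percentile values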

To see how well this works, I wrote a Perl benchmark script that benchmarks the built-in pct function and the quantile_type_3 function on synthetic data, and then calls a companion R script to profile the 9 quantile functions and the 3 sort functions in R on the same dataset.

I obtained the following performance figures on my old Xeon: the “de-broadcasted” version of the quantile function achieves the same performance as the R implementations, whereas the PDL broadcasting version is 100 times slower.

Test Iterations Elements Quantiles Elapsed Time (s)
pct 10 1000000 100 132.430000
quantile_type_3 10 1000000 100 1.320000
pct_R_1 10 1000000 100 1.290000
pct_R_2 10 1000000 100 1.281000
pct_R_3 10 1000000 100 1.274000
pct_R_4 10 1000000 100 1.283000
pct_R_5 10 1000000 100 1.290000
pct_R_6 10 1000000 100 1.286000
pct_R_7 10 1000000 100 1.233000
pct_R_8 10 1000000 100 1.309000
pct_R_9 10 1000000 100 1.291000
sort_quick 10 1000000 100 1.220000
sort_shell 10 1000000 100 1.758000
sort_radix 10 1000000 100 0.924000

As can be seen from the table, the sorting operation accounts for the bulk of the execution time of the quantile functions.

Two major take-home points: 1) don't be afraid to look under the hood / inside the black box when performance is surprisingly disappointing! 2) be careful with broadcast operations in PDL, NumPy, or Matlab.

(dlxxvi) 8 great CPAN modules released last week

Niceperl
Updates for great CPAN modules released last week. A module is considered great if its favorites count is greater than or equal to 12.

  1. App::Netdisco - An open source web-based network management tool.
    • Version: 2.095004 on 2025-11-23, with 798 votes
    • Previous CPAN version: 2.095003 was 4 days before
    • Author: OLIVER
  2. CPANSA::DB - the CPAN Security Advisory data as a Perl data structure, mostly for CPAN::Audit
    • Version: 20251123.001 on 2025-11-23, with 25 votes
    • Previous CPAN version: 20251116.001 was 7 days before
    • Author: BRIANDFOY
  3. Cucumber::TagExpressions - A library for parsing and evaluating cucumber tag expressions (filters)
    • Version: 8.1.0 on 2025-11-26, with 16 votes
    • Previous CPAN version: 8.0.0 was 1 month, 11 days before
    • Author: CUKEBOT
  4. JSON::Schema::Modern - Validate data against a schema using a JSON Schema
    • Version: 0.625 on 2025-11-28, with 14 votes
    • Previous CPAN version: 0.624 was 2 days before
    • Author: ETHER
  5. Mail::Box - complete E-mail handling suite
    • Version: 3.012 on 2025-11-27, with 16 votes
    • Previous CPAN version: 3.011 was 7 months, 8 days before
    • Author: MARKOV
  6. meta - meta-programming API
    • Version: 0.015 on 2025-11-28, with 14 votes
    • Previous CPAN version: 0.014 was 2 months, 24 days before
    • Author: PEVANS
  7. Type::Tiny - tiny, yet Moo(se)-compatible type constraint
    • Version: 2.008006 on 2025-11-26, with 146 votes
    • Previous CPAN version: 2.008005 was 5 days before
    • Author: TOBYINK
  8. Workflow - Simple, flexible system to implement workflows
    • Version: 2.09 on 2025-11-23, with 34 votes
    • Previous CPAN version: 2.08 was 10 days before
    • Author: JONASBN

GitHub and the Perl License

blogs.perl.org

When we publish our Perl module repository on GitHub, we might notice something peculiar in the "About" section of our repository: GitHub doesn't recognize the Perl 5 license. This can be a bit confusing, especially when we've explicitly stated the licensing in our LICENSE file.

Without a properly defined license, GitHub ranks the quality of a repository lower. This is also unfortunate because it limits the "searchability" of our repository: GitHub cannot index it by license, and users cannot search for it by license. This matters today more than ever, as many enterprises rule out open source projects purely on the grounds that their licensing is poorly managed.

The Problem: Two Licenses in One File

The standard Perl 5 license, as used by many modules, is a dual license: Artistic License (2.0) and GNU General Public License (GPL) version 1 or later. Often, this is included in a single LICENSE file in the repository root.

GitHub's license detection mechanism, powered by Licensee, is designed to identify a single, clear license. When it encounters a file with two distinct licenses concatenated, it fails to make a definitive identification.

Here's an example of a repository where GitHub doesn't recognize the license. Notice the missing license badge in the "About" section:

github-licenses-not-visible.png

Also the "quick select" banner above Readme file does not acknowledge which license there is. github-licenses-not-visible-bottom-bar.png

The Solution: Separate License Files

The simplest and most effective solution is to provide each license in its own dedicated file. This allows Licensee to easily identify and display both licenses. This is perfectly valid because the Perl 5 license explicitly allows for distribution under either the Artistic License or the GPL. Providing both licenses separately simply makes it clearer which licenses apply and how they are presented.

(The other reason for having multiple licenses is when different parts of the repository are under different licenses. But that is not our problem here.)

For example, instead of a single LICENSE file containing both, we would have:

  • LICENSE-Artistic-2.0
  • LICENSE-GPL-3

Let's look at an example from my own env-assert repository. In this repository, I've separated the licenses into LICENSE-Artistic-2.0 and LICENSE-GPL-3.

And here's how GitHub's "About" section looks for env-assert, clearly recognizing both licenses:

github-licenses-visible.png

As we can see, GitHub now correctly identifies "Artistic-2.0" and "GPL-3.0" as the licenses for the project.

Same is also visible in the "quick select" bar:

github-licenses-visible-bottom-bar.png

Automating with Software::Policies and Dist::Zilla::Plugin::Software::Policies

Manually creating and maintaining these separate license files for every module can be tedious. Fortunately, there is a way to automate this process if you are using Dist::Zilla for authoring.

Dist::Zilla::Plugin::Software::Policies

If we're using Dist::Zilla for our module authoring, Dist-Zilla-Plugin-Software-Policies can automatically check that we have the correct license files. It uses Dist::Zilla's internal variable licence to determine the correct license files.

The Dist::Zilla plugin uses Software-Policies as a backend to do the heavy lifting.

Software::Policies

Software::Policies is a module that provides a framework for defining and enforcing software policies, including licensing. It comes with a pre-defined policy for Perl 5's double license. It can also generate other policy files, such as CONTRIBUTING.md, CODE_OF_CONDUCT.md, and SECURITY.md.

By using Software::Policies, we can programmatically check for the presence and content of our license files.

This approach not only solves the GitHub license detection problem but also helps us maintain consistent and correct licensing across all our Perl modules, integrating it directly into our build workflow.

By configuring this plugin in our dist.ini, we can ensure that our distribution always includes the correct and properly formatted license files, making GitHub (and other license scanners) happy.

Here's a simplified example of how we might configure it in our dist.ini:

[Software::Policies / License]
policy_attribute = perl_5_double_license = true

[Test::Software::Policies]
include_policy = License

This configuration tells the Dist::Zilla plugin Test::Software::Policies to apply the Perl licensing policy, which typically means the Artistic License 2.0 and the GPL. When we build our distribution with Dist::Zilla, the plugin will create a test file that checks for the existence and content of the LICENSE-Artistic-2.0 and LICENSE-GPL-3 files. During the testing phase, when running dzil test or dzil release, the test files will be run, and if the license files are missing or incorrect, the tests will fail.

To generate the files, we can run the command dzil policies License or just dzil policies. This will create the files according to the [Software::Policies / License] configuration in dist.ini.

We cannot create the files automatically during the build because then they would only be included in the release, not in the repository, and it is precisely in the repository that we need them for GitHub's sake. So the process of creating or updating the license files has to include this small manual stage.

Perlevescava Rewards & Bonus Guide — November 2025

Perl on Medium

Discover all the ways to unlock rewards and bonuses with Perlevescava.

This week in PSC (208) | 2025-11-25

blogs.perl.org

All three of us met.

  • The refalias draft PPC on the mailing list is looking good. We encourage Gianni to turn it into a full PPC doc PR
  • We still would like more automation around making real CPAN distributions out of dist/ dirs. Paul will write an email requesting assistance on that subject specifically
  • Briefly discussed the subject of the meta module handling signatures with named parameters. Further discussion will continue on the email thread.

[P5P posting of this summary]

Elderly Camels in the Cloud

Perl Hacks

In last week’s post I showed how to run a modern Dancer2 app on Google Cloud Run. That’s lovely if your codebase already speaks PSGI and lives in a nice, testable, framework-shaped box.

But that’s not where a lot of Perl lives.

Plenty of useful Perl on the internet is still stuck in old-school CGI – the kind of thing you’d drop into cgi-bin on a shared host in 2003 and then try not to think about too much.

So in this post, I want to show that:

If you can run a Dancer2 app on Cloud Run, you can also run ancient CGI on Cloud Run – without rewriting it.

To keep things on the right side of history, we’ll use nms FormMail rather than Matt Wright’s original script, but the principle is exactly the same.


Prerequisites: Google Cloud and Cloud Run

If you already followed the Dancer2 post and have Cloud Run working, you can skip this section and go straight to “Wrapping nms FormMail in PSGI”.

If not, here’s the minimum you need.

  1. Google account and project

    • Go to the Google Cloud Console.

    • Create a new project (e.g. “perl-cgi-cloud-run-demo”).

  2. Enable billing

    • Cloud Run is pay-as-you-go with a generous free tier, but you must attach a billing account to your project.

  3. Install the gcloud CLI

    • Install the Google Cloud SDK for your platform.

    • Run:

      gcloud init

      and follow the prompts to:

      • log in

      • select your project

      • pick a default region (I’ll assume “europe-west1” below).

  4. Enable required APIs

    In your project:

    gcloud services enable \
      run.googleapis.com \
      artifactregistry.googleapis.com \
      cloudbuild.googleapis.com
  5. Create a Docker repository in Artifact Registry

    gcloud artifacts repositories create formmail-repo \
    --repository-format=docker \
    --location=europe-west1 \
    --description="Docker repo for CGI demos"

That’s all the GCP groundwork. Now we can worry about Perl.


The starting point: an old CGI FormMail

Our starting assumption:

  • You already have a CGI script like nms FormMail

  • It’s a single “.pl” file, intended to be dropped into “cgi-bin”

  • It expects to be called via the CGI interface and send mail using:

open my $mail, '|-', '/usr/sbin/sendmail -t'
or die "Can't open sendmail: $!";

On a traditional host, Apache (or similar) would:

  • parse the HTTP request

  • set CGI environment variables (REQUEST_METHOD, QUERY_STRING, etc.)

  • run formmail.pl as a process

  • let it call /usr/sbin/sendmail

Cloud Run gives us none of that. It gives us:

  • an HTTP endpoint

  • backed by a container

  • listening on a port ($PORT)

Our job is to recreate just enough of that old environment inside a container.

We’ll do that in two small pieces:

  1. A PSGI wrapper that emulates CGI.

  2. A sendmail shim so the script can still “talk” sendmail.


Architecture in one paragraph

Inside the container we’ll have:

  • nms FormMail – unchanged CGI script at /app/formmail.pl

  • PSGI wrapper (app.psgi) – using CGI::Compile and CGI::Emulate::PSGI

  • Plack/Starlet – a simple HTTP server exposing app.psgi on $PORT

  • msmtp-mta – providing /usr/sbin/sendmail and relaying mail to a real SMTP server

Cloud Run just sees “HTTP service running in a container”. Our CGI script still thinks it’s on an early-2000s shared host.


Step 1 – Wrapping nms FormMail in PSGI

First we write a tiny PSGI wrapper. This is the only new Perl we need:

# app.psgi

use strict;
use warnings;

use CGI::Compile;
use CGI::Emulate::PSGI;

# Path inside the container
my $cgi_script = "/app/formmail.pl";

# Compile the CGI script into a coderef
my $cgi_app = CGI::Compile->compile($cgi_script);

# Wrap it in a PSGI-compatible app
my $app = CGI::Emulate::PSGI->handler($cgi_app);

# Return PSGI app
$app;

That’s it.

  • CGI::Compile loads the CGI script and turns its main package into a coderef.

  • CGI::Emulate::PSGI fakes the CGI environment for each request.

  • The CGI script doesn’t know or care that it’s no longer being run by Apache.

Later, we’ll run this with:

plackup -s Starlet -p ${PORT:-8080} app.psgi

Step 2 – Adding a sendmail shim

Next problem: Cloud Run doesn’t give you a local mail transfer agent.

There is no real /usr/sbin/sendmail, and you wouldn’t want to run a full MTA in a stateless container anyway.

Instead, we’ll install msmtp-mta, a light-weight SMTP client that includes a sendmail-compatible wrapper. It gives you a /usr/sbin/sendmail binary that forwards mail to a remote SMTP server (Mailgun, SES, your mail provider, etc.).

From the CGI script’s point of view, nothing changes:

open my $mail, '|-', '/usr/sbin/sendmail -t'
  or die "Can't open sendmail: $!";
# ... write headers and body ...
close $mail;

Under the hood, msmtp ships it off to your configured SMTP server.

We’ll configure msmtp from environment variables at container start-up, so Cloud Run’s --set-env-vars values are actually used.

Step 3 – Dockerfile (+ entrypoint) for Perl, PSGI and sendmail shim

Here’s a complete Dockerfile that pulls this together.

FROM perl:5.40

# Install msmtp-mta as a sendmail-compatible shim
RUN apt-get update && \
    apt-get install -y --no-install-recommends msmtp-mta ca-certificates && \
    rm -rf /var/lib/apt/lists/*

# Install Perl dependencies
RUN cpanm --notest \
    CGI::Compile \
    CGI::Emulate::PSGI \
    Plack \
    Starlet

WORKDIR /app

# Copy nms FormMail (unchanged) and the PSGI wrapper
COPY formmail.pl app.psgi /app/
RUN chmod 755 /app/formmail.pl

# Entrypoint script that:
# 1. writes /etc/msmtprc from environment variables
# 2. starts the PSGI server
COPY docker-entrypoint.sh /usr/local/bin/docker-entrypoint.sh
RUN chmod +x /usr/local/bin/docker-entrypoint.sh

ENV PORT=8080

EXPOSE 8080

CMD ["docker-entrypoint.sh"]

And here’s the docker-entrypoint.sh script:

#!/bin/sh

set -e

# Reasonable defaults

: "${MSMTP_ACCOUNT:=default}"
: "${MSMTP_PORT:=587}"

if [ -z "$MSMTP_HOST" ] || [ -z "$MSMTP_USER" ] || [ -z "$MSMTP_PASSWORD" ] || [ -z "$MSMTP_FROM" ]; then
  echo "Warning: MSMTP_* environment variables not fully set; mail probably won't work." >&2
fi

cat > /etc/msmtprc <<EOF
defaults
auth           on
tls            on
tls_trust_file /etc/ssl/certs/ca-certificates.crt
logfile        /var/log/msmtp.log

account  ${MSMTP_ACCOUNT}
host     ${MSMTP_HOST}
port     ${MSMTP_PORT}
user     ${MSMTP_USER}
password ${MSMTP_PASSWORD}
from     ${MSMTP_FROM}

account default : ${MSMTP_ACCOUNT}
EOF

chmod 600 /etc/msmtprc

# Start the PSGI app
exec plackup -s Starlet -p "${PORT:-8080}" app.psgi

Key points you might want to note:

  • We never touch formmail.pl. It goes into /app and that’s it.

  • msmtp gives us /usr/sbin/sendmail, so the CGI script stays in its 1990s comfort zone.

  • The entrypoint writes /etc/msmtprc at runtime, so Cloud Run’s environment variables are actually used.


Step 4 – Building and pushing the image

With the Dockerfile and docker-entrypoint.sh in place, we can build and push the image to Artifact Registry.

I’ll assume:

  • Project ID: PROJECT_ID

  • Region: europe-west1

  • Repository: formmail-repo

  • Image name: nms-formmail

First, build the image locally:

docker build -t europe-west1-docker.pkg.dev/PROJECT_ID/formmail-repo/nms-formmail:latest .

Then configure Docker to authenticate against Artifact Registry:

gcloud auth configure-docker europe-west1-docker.pkg.dev

Now push the image:

docker push europe-west1-docker.pkg.dev/PROJECT_ID/formmail-repo/nms-formmail:latest

If you’d rather not install Docker locally, you can let Google Cloud Build do this for you:

gcloud builds submit \
  --tag europe-west1-docker.pkg.dev/PROJECT_ID/formmail-repo/nms-formmail:latest

Use whichever workflow your team is happier with; Cloud Run doesn’t care how the image got there.


Step 5 – Deploying to Cloud Run

Now we can create a Cloud Run service from that image.

You’ll need SMTP settings from somewhere (Mailgun, SES, your mail provider). I’ll use “Mailgun-ish” examples here; adjust as required.

gcloud run deploy nms-formmail \
  --image=europe-west1-docker.pkg.dev/PROJECT_ID/formmail-repo/nms-formmail:latest \
  --platform=managed \
  --region=europe-west1 \
  --allow-unauthenticated \
  --set-env-vars MSMTP_HOST=smtp.mailgun.org \
  --set-env-vars MSMTP_PORT=587 \
  --set-env-vars MSMTP_USER=postmaster@mg.example.com \
  --set-env-vars MSMTP_PASSWORD=YOUR_SMTP_PASSWORD \
  --set-env-vars MSMTP_FROM=webforms@example.com

Cloud Run will give you an HTTPS URL, something like:

https://nms-formmail-abcdefgh-uk.a.run.app

Your HTML form (on whatever website you like) can now post to that URL.

For example:

<form action="https://nms-formmail-abcdefgh-uk.a.run.app/formmail.pl" method="post">
  <input type="hidden" name="recipient" value="contact@example.com">
  <input type="email" name="email" required>
  <textarea name="comments" required></textarea>
  <button type="submit">Send</button>
</form>

Depending on how you wire the routes, you may also just post to / – the important point is that the request hits the PSGI app, which faithfully re-creates the CGI environment and hands control to formmail.pl.


How much work did we actually do?

Compared to the Dancer2 example, the interesting bit here is what we didn’t do:

  • We didn’t convert the CGI script to PSGI.

  • We didn’t add a framework.

  • We didn’t touch its mail-sending code.

We just:

  1. Wrapped it with CGI::Emulate::PSGI.

  2. Dropped a sendmail shim in front of a real SMTP service.

  3. Put it in a container and let Cloud Run handle the scaling and HTTPS.

If you’ve still got a cupboard full of old CGI scripts doing useful work, this is a nice way to:

  • get them off fragile shared hosting

  • put them behind HTTPS

  • run them in an environment you understand (Docker + Cloud Run)

  • without having to justify a full rewrite up front


When should you rewrite instead?

This trick is handy, but it’s not a time machine.

If you find yourself wanting to:

  • add tests

  • share logic between multiple scripts

  • integrate with a modern app or API

  • do anything more complex than “receive a form, send an email”

…then you probably do want to migrate the logic into a Dancer2 (or other PSGI) app properly.

But as a first step – or as a way to de-risk moving away from legacy hosting – wrapping CGI for Cloud Run works surprisingly well.


FormMail is still probably a bad idea

All of this proves that you can take a very old CGI script and run it happily on Cloud Run. It does not magically turn FormMail into a good idea in 2025.

The usual caveats still apply:

  • Spam and abuse – anything that will send arbitrary email based on untrusted input is a magnet for bots. You’ll want rate limiting, CAPTCHA, some basic content checks, and probably logging and alerting.

  • Validation and sanitisation – a lot of classic FormMail deployments were “drop it in and hope”. If you’re going to the trouble of containerising it, you should at least ensure it’s a recent nms version, configured properly, and locked down to only the recipients you expect.

  • Better alternatives – for any new project, you’d almost certainly build a tiny API endpoint or Dancer2 route that validates input, talks to a proper mail-sending service, and returns JSON. The CGI route is really a migration trick, not a recommendation for fresh code.

So think of this pattern as a bridge for legacy, not a template for greenfield development.


Conclusion

In the previous post we saw how nicely a modern Dancer2 app fits on Cloud Run: PSGI all the way down, clean deployment, no drama. This time we’ve taken almost the opposite starting point – a creaky old CGI FormMail – and shown that you can still bring it along for the ride with surprisingly little effort.

We didn’t rewrite the script, we didn’t introduce a framework, and we didn’t have to fake an entire 90s LAMP stack. We just wrapped the CGI in PSGI, dropped in a sendmail shim, and let Cloud Run do what it does best: run a container that speaks HTTP.

If you’ve got a few ancient Perl scripts quietly doing useful work on shared hosting, this might be enough to get them onto modern infrastructure without a big-bang rewrite. And once they’re sitting in containers, behind HTTPS, with proper logging and observability, you’ll be in a much better place to decide which ones deserve a full Dancer2 makeover – and which ones should finally be retired.

The post Elderly Camels in the Cloud first appeared on Perl Hacks.

(dlxxv) 8 great CPAN modules released last week

Niceperl
Updates for great CPAN modules released last week. A module is considered great if its favorites count is greater than or equal to 12.

  1. App::Netdisco - An open source web-based network management tool.
    • Version: 2.095003 on 2025-11-18, with 799 votes
    • Previous CPAN version: 2.095002 was 2 days before
    • Author: OLIVER
  2. JSON::Schema::Modern - Validate data against a schema using a JSON Schema
    • Version: 0.623 on 2025-11-17, with 13 votes
    • Previous CPAN version: 0.622 was 8 days before
    • Author: ETHER
  3. Module::CoreList - what modules shipped with versions of perl
    • Version: 5.20251120 on 2025-11-20, with 44 votes
    • Previous CPAN version: 5.20251022 was 27 days before
    • Author: BINGOS
  4. Net::Amazon::S3 - Use the Amazon S3 - Simple Storage Service
    • Version: 0.992 on 2025-11-22, with 13 votes
    • Previous CPAN version: 0.991 was 3 years, 4 months, 5 days before
    • Author: BARNEY
  5. OpenTelemetry - A Perl implementation of the OpenTelemetry standard
    • Version: 0.033 on 2025-11-21, with 30 votes
    • Previous CPAN version: 0.032 was 1 day before
    • Author: JJATRIA
  6. SPVM - The SPVM Language
    • Version: 0.990107 on 2025-11-18, with 36 votes
    • Previous CPAN version: 0.990106 was 6 days before
    • Author: KIMOTO
  7. Type::Tiny - tiny, yet Moo(se)-compatible type constraint
    • Version: 2.008005 on 2025-11-20, with 145 votes
    • Previous CPAN version: 2.008004 was 1 month, 3 days before
    • Author: TOBYINK
  8. XML::Feed - XML Syndication Feed Support
    • Version: v1.0.0 on 2025-11-17, with 19 votes
    • Previous CPAN version: 0.65 was 1 year, 4 months, 8 days before
    • Author: DAVECROSS

Dave writes:

Last month was mostly spent doing a second big refactor of ExtUtils::ParseXS. My previous refactor converted the parser to assemble each XSUB into an Abstract Syntax Tree (AST) and only then emit the C code for it (previously the parsing and C code emitting were interleaved on the fly). This new work extends that so that the whole XS file is now one big AST, and the C code is only generated once all parsing is complete.

As well as fixing lots of minor parsing bugs along the way, another benefit of this big refactoring is that ExtUtils::ParseXS becomes manageable once again. Rather than one big 1400-line parsing loop, the parsing and code generating is split up into lots of little methods in subclasses which represent the nodes of the AST and which process just one thing.

As an example, the logic which handled (permissible) duplicate XSUB declarations in different C preprocessor branches, such as

#ifdef USE_2ARG
int foo(int i, int j)
#else
int foo(int i)
#endif

used to be spread over many parts of the program; it's now almost all concentrated into the parsing and code-emitting methods of a single Node subclass.

This branch is currently pushed and undergoing review.

My earlier work on rewriting the XS reference manual, perlxs.pod, was made into a PR a month ago, and this month I revised it based on reviewers' feedback.

Summary:
  * 11:39 modernise perlxs.pod
  * 64:57 refactor ExtUtils::ParseXS: file-scoped AST

Total: 76:36 (HH:MM)

Are you passionate about making an impact in one of the most dynamic tech sectors – payments in Europe?

Do you thrive in a fast-paced scale-up environment, surrounded by an ambitious and creative team?

We’re on a mission to make payments simple, secure, and accessible for every business. With powerful in-house technology and deep expertise, our modular platform brings online, in-person, and cross-border payments together in one place — giving merchants the flexibility to scale on their own terms. Through a partnership-first approach, we tackle complexity head-on, keep payments running smoothly, and boost success rates. It’s how we level the playing field for businesses of all sizes and ambitions.

Join a leading tech company driving innovation in the payments industry. You’ll work with global leaders like Visa and Mastercard, as well as next-generation “pay later” solutions such as Klarna and Afterpay. Our engineering teams apply Domain-Driven Design (DDD) principles and a microservices architecture to build scalable and maintainable systems.

  • Develop and maintain Perl-based applications and systems to handle risk management, monitoring, and onboarding processes
  • Collaborate with other developers and cross-functional teams to define, design, and deliver new features and functionalities
  • Assist in the migration of projects from Perl to other languages, such as Java, while ensuring the smooth operation and transition of systems
  • Contribute to code reviews and provide valuable insights to uphold coding standards and best practices
  • Stay up to date with the latest industry trends and technologies to drive innovation and enhance our products

Company policy is on-site, with 1/2 workday from home depending on your location.

I needed to have some defaults available in my i3 configuration and was using LightDM. I asked in the i3 GitHub discussion pages if people knew why it was failing. It appears Debian stripped some functionality. So how do you solve this?

Answer

You want to have your own session wrapper for lightdm. I stole this recipe from Ubuntu:

#!/usr/bin/sh

# Source the system-wide and per-user profile files before starting the X session
for file in "/etc/profile" "$HOME/.profile" "/etc/xprofile" "$HOME/.xprofile"; do
 [ ! -f "$file" ] && continue
 . "$file"
done

/etc/X11/Xsession "$@"

I install this in /usr/local/bin/lightdm-session. And then dpkg-divert the Debian version of lightdm.conf:

(dcxix) metacpan weekly report

Niceperl

This is the weekly favourites list of CPAN distributions. Votes count: 25

This week there isn't any remarkable distribution

Build date: 2025/11/16 11:03:31 GMT



The Perl and Raku Foundation has announced a £1,000 sponsorship of the upcoming London Perl and Raku Workshop, reinforcing its ongoing commitment to supporting community-driven technical events. The workshop, one of the longest-running grassroots Perl gatherings in the UK, brings together developers, educators, and open-source enthusiasts for a day of talks, hands-on sessions, and collaborative learning centered on Perl, Raku, and related technologies.

The foundation’s contribution will help cover venue expenses, accessibility measures, and attendee resources. Organizers intend to use the support received from sponsors to keep the event free to attend, maintaining its tradition of lowering barriers for both newcomers and experienced programmers.

This year’s workshop is expected to feature a broad program, including presentations on language internals, modern development practices, and applied use cases within industry and research. Community members from across Europe are anticipated to participate, reflecting the workshop’s reputation as a focal point for Perl activity. The workshop is scheduled for November 29, 2025, in the heart of London at ISH Venues, near Regent's Park. Several speakers are already confirmed for this year's workshop, including Stevan Little and TPRF White Camel Award winners Sawyer X and Mohammad Sajid Anwar. For more information about the event, visit https://www.londonperlworkshop.com/.

By backing the event, The Perl and Raku Foundation continues its broader mission to foster growth, education, and innovation across both language communities. The London Perl Workshop remains one of the foundation’s key community touchpoints, offering a collaborative space for developers to share knowledge and help shape the future of the languages.

Episode 7 - CPAN Security Group

The Underbar
This is the last of the interviews recorded during the Perl Toolchain Summit 2025 in Leipzig, this time with the CPAN Security Group. We talked about how the group was formed, the security landscape for Perl and CPAN, and how volunteers are always needed.

Go Ahead ‘make’ My Day (Part III)

This is the last in a 3-part series on Scriptlets. You can catch up by reading our introduction and dissection of Scriptlets.

In this final part, we talk about restraint - the discipline that keeps a clever trick from turning into a maintenance hazard.

That uneasy feeling…

So you are starting to write a few scriptlets and it seems pretty cool. But something doesn’t feel quite right…

You’re editing a Makefile and suddenly you feel anxious. Ah, you expected syntax highlighting, linting, proper indentation, and maybe that warm blanket of static analysis. So when we drop a 20-line chunk of Perl or Python into our Makefile, our inner OCD alarms go off. No highlighting. No linting. Just raw text.

The discomfort isn’t a flaw - it’s feedback. It tells you when you’ve added too much salt to the soup.

A scriptlet is not a script!

A scriptlet is a small, focused snippet of code embedded inside a Makefile that performs one job quickly and deterministically. The “-let” suffix matters. It’s not a standalone program. It’s a helper function, a convenience, a single brushstroke that belongs in the same canvas as the build logic it supports.

If you ever feel the urge to bite your nails, pick at your skin, or start counting the spaces in your indentation - stop. You’ve crossed the line. What you’ve written is no longer a scriptlet; it’s a script. Give it a real file, a shebang, and a test harness. Keep the build clean.

Why we use them

Scriptlets shine where proximity and simplicity matter more than reuse (not that we can’t throw it in a separate file and include it in our Makefile).

  • Cleanliness: prevents a recipe from looking like a shell script.
  • Locality: live where they’re used. No path lookups, no installs.
  • Determinism: transform well-defined input into output. Nothing more.
  • Portability (of the idea): every CI/CD system that can run make can run a one-liner.

A Makefile that can generate its own dependency file, extract version numbers, or rewrite a cpanfile doesn’t need a constellation of helper scripts. It just needs a few lines of inline glue.

Why they’re sometimes painful

We lose the comforts that make us feel like professional developers:

  • No syntax highlighting.
  • No linting or type hints.
  • No indentation guides.
  • No “Format on Save.”

The trick is to accept that pain as a necessary check on the limits of the scriptlet. If you’re constantly wishing for linting and editor help, it’s your subconscious telling you: this doesn’t belong inline anymore. You’ve outgrown the -let.

When to promote your scriptlet to a script

Promote a scriptlet to a full-blown script when:

  • It exceeds 30-50 lines.
  • It gains conditionals or error handling.
  • You need to test it independently.
  • It uses more than 1 or 2 non-core features.
  • It’s used by more than one target or project.
  • You’re debugging quoting more than logic.
  • You’re spending more time fixing indentation than working on the build.

At that point, you’re writing software, not glue. Give it a name, a shebang, and a home in your tools/ directory.

When to keep it inside your Makefile

Keep it inline when:

  • It’s short, pure, and single-use.
  • It depends primarily on the environment already assumed by your build (Perl, Python, awk, etc.).
  • It’s faster to read than to reference.

A good scriptlet reads like a make recipe: do this transformation right here, right now.

define create_cpanfile =
    while (<STDIN>) {
        s/[#].*//; s/^\s+|\s+$//g; next if $_ eq q{};
        my ($mod,$v) = split /\s+/, $_, 2;
        print qq{requires "$mod", "$v";\n};
    }
endef

export s_create_cpanfile = $(value create_cpanfile)

That’s a perfect scriptlet: small, readable, deterministic, and local.
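
To see what that transformation actually does, here is the same filtering logic as a stand-alone Perl snippet, fed a couple of made-up sample lines (purely illustrative, not part of the original article):

# Same logic as the create_cpanfile scriptlet above, run over in-memory sample input
my @lines = ("Plack 1.0050   # web toolkit", "", "JSON::MaybeXS 1.004");
for (@lines) {
    s/[#].*//; s/^\s+|\s+$//g; next if $_ eq q{};
    my ($mod, $v) = split /\s+/, $_, 2;
    print qq{requires "$mod", "$v";\n};
}
# prints:
#   requires "Plack", "1.0050";
#   requires "JSON::MaybeXS", "1.004";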

Rule of Thumb: If it fits on one screen, keep it inline. If it scrolls, promote it.

Tools for the OCD developer

If you must relieve the OCD symptoms without promoting your scriptlet to a script:

  • Add a lint-scriptlets target: perl -c -e '$(s_create_requires)' checks syntax without running it.
  • Some editors (Emacs mmm-mode, Vim polyglot) can treat marked sections as sub-languages to enable localized language specific editing features.
  • Use include to include a scriptlet into your Makefile

…however, try to resist the urge to over-optimize the tooling. Feeling the uneasiness grow helps identify the boundary between scriptlets and scripts.

You’ve been warned!

Because scriptlets are powerful, flexible, and fast, it’s easy to reach for them too often or make them the focus of your project. They start as a cure for friction - a way to express a small transformation inline - but left unchecked, they can sometimes grow arms and legs. Before long, your Makefile turns into a Frankenstein monster.

The great philosopher Basho (or at least I think it was him) once said:

A single aspirin tablet eases pain. A whole bottle sends you to the hospital.

Thanks for reading.

Learn More

For years, most of my Perl web apps lived happily enough on a VPS. I had full control of the box, I could install whatever I liked, and I knew where everything lived.

In fact, over the last eighteen months or so, I wrote a series of blog posts explaining how I developed a system for deploying Dancer2 apps and, eventually, controlling them using systemd. I’m slightly embarrassed by those posts now.

Because the control that my VPS gave me also came with a price: I also had to worry about OS upgrades, SSL renewals, kernel updates, and the occasional morning waking up to automatic notifications that one of my apps had been offline since midnight.

Back in 2019, I started writing a series of blog posts called Into the Cloud that would follow my progress as I moved all my apps into Docker containers. But real life intruded and I never made much progress on the project.

Recently, I returned to this idea (yes, I’m at least five years late here!). I’ve been working on migrating those old Dancer2 applications from my IONOS VPS to Google Cloud Run. The difference has been amazing. My apps now run in their own containers, scale automatically, and the server infrastructure requires almost no maintenance.

This post walks through how I made the jump – and how you can too – using Perl, Dancer2, Docker, GitHub Actions, and Google Cloud Run.


Why move away from a VPS?

Running everything on a single VPS used to make sense. You could ssh in, restart services, and feel like you were in control. But over time, the drawbacks grow:

  • You have to maintain the OS and packages yourself.

  • One bad app or memory leak can affect everything else.

  • You’re paying for full-time CPU and RAM even when nothing’s happening.

  • Scaling means provisioning a new server — not something you do in a coffee break.

Cloud Run, on the other hand, runs each app as a container and only charges you while requests are being served. When no-one’s using your app, it scales to zero and costs nothing.

Even better: no servers to patch, no ports to open, no SSL certificates to renew — Google does all of that for you.


What we’ll build

Here’s the plan. We’ll take a simple Dancer2 app and:

  1. Package it as a Docker container.

  2. Build that container automatically in GitHub Actions.

  3. Deploy it to Google Cloud Run, where it runs securely and scales automatically.

  4. Map a custom domain to it and forget about server admin forever.

If you’ve never touched Docker or Cloud Run before, don’t worry – I’ll explain what’s going on as we go.


Why Cloud Run fits Perl surprisingly well

Perl’s ecosystem has always valued stability and control. Containers give you both: you can lock in a Perl version, CPAN modules, and any shared libraries your app needs. The image you build today will still work next year.

Cloud Run runs those containers on demand. It’s effectively a managed starman farm where Google handles the hard parts – scaling, routing, and HTTPS.

You pay for CPU and memory per request, not per server. For small or moderate-traffic Perl apps, it’s often well under £1/month.


Step 1: Dockerising a Dancer2 app

If you’re new to Docker, think of it as a way of bundling your whole environment — Perl, modules, and configuration — into a portable image. It’s like freezing a working copy of your app so it can run identically anywhere.

Here’s a minimal Dockerfile for a Dancer2 app:

FROM perl:5.42
LABEL maintainer="dave@perlhacks.com"

# Install Carton and Starman
RUN cpanm Carton Starman

# Copy the app into the container
COPY . /app
WORKDIR /app

# Install dependencies
RUN carton install --deployment

EXPOSE 8080
CMD ["carton", "exec", "starman", "--port", "8080", "bin/app.psgi"]

Let’s break that down:

  • FROM perl:5.42 — starts from an official Perl image on Docker Hub.

  • Carton keeps dependencies consistent between environments.

  • The app is copied into /app, and carton install --deployment installs exactly what’s in your cpanfile.snapshot.

  • The container exposes port 8080 (Cloud Run’s default).

  • The CMD runs Starman, serving your Dancer2 app.
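
For reference, bin/app.psgi (the file the CMD points at) is the standard PSGI entry point a Dancer2 skeleton generates. A minimal hand-written version, with MyApp standing in for your application’s module name, looks something like this:

#!/usr/bin/env perl
use strict;
use warnings;

use FindBin;
use lib "$FindBin::Bin/../lib";

# MyApp is a placeholder for your Dancer2 application class
use MyApp;

# Hand the PSGI code reference to whatever server runs this file (Starman, here)
MyApp->to_app;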

To test it locally:

docker build -t myapp .
docker run -p 8080:8080 myapp

Then visit http://localhost:8080. If you see your Dancer2 homepage, you’ve successfully containerised your app.


Step 2: Building the image in GitHub Actions

Once it works locally, we can automate it. GitHub Actions will build and push our image to Google Artifact Registry whenever we push to main or tag a release.

Here’s a simplified workflow file (.github/workflows/build.yml):

name: Build container

on:
  push:
    branches: [ main ]
    tags: [ 'v*' ]
  workflow_dispatch:

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - uses: google-github-actions/setup-gcloud@v3
        with:
          project_id: ${{ secrets.GCP_PROJECT }}
          service_account_email: ${{ secrets.GCP_SA_EMAIL }}
          workload_identity_provider: ${{ secrets.GCP_WIF_PROVIDER }}

      - name: Build and push image
        run: |
          IMAGE="europe-west1-docker.pkg.dev/${{ secrets.GCP_PROJECT }}/containers/myapp:$GITHUB_SHA"
          docker build -t $IMAGE .
          docker push $IMAGE

You’ll notice a few secrets referenced in the workflow — things like your Google Cloud project ID and credentials. These are stored securely in GitHub Actions. When the workflow runs, GitHub uses those secrets to authenticate as you and access your Google Cloud account, so it can push the new container image or deploy your app.

You only set those secrets up once, and they’re encrypted and hidden from everyone else — even if your repository is public.

Once that’s set up, every push builds a fresh, versioned container image.


Step 3: Deploying to Cloud Run

Now we’re ready to run it in the cloud. We’ll do that using Google’s command line program, gcloud. It’s available from Google’s official downloads or through most Linux package managers — for example:

# Fedora, RedHat or similar
sudo dnf install google-cloud-cli
# or on Debian/Ubuntu:
sudo apt install google-cloud-cli

Once installed, authenticate it with your Google account:

gcloud auth login
gcloud config set project your-project-id

That links the CLI to your Google Cloud project and lets it perform actions like deploying to Cloud Run.

Once that’s done, you can deploy manually from the command line:

gcloud run deploy myapp \
--image=europe-west1-docker.pkg.dev/MY_PROJECT/containers/myapp:$GITHUB_SHA \
--region=europe-west1 \
--allow-unauthenticated \
--port=8080

This tells Cloud Run to start a new service called myapp, using the image we just built.

After a minute or two, Google will give you a live HTTPS URL, like:

    • https://myapp-abcdef12345-ew.a.run.app

Visit it — and if all went well, you’ll see your familiar Dancer2 app, running happily on Cloud Run.

To connect your own domain, run:

gcloud run domain-mappings create \
--service=myapp \
--domain=myapp.example.com

Then update your DNS records as instructed. Within an hour or so, Cloud Run will issue a free SSL certificate for you.


Step 4: Automating the deployment

Once the manual deployment works, we can automate it too.

Here’s a second GitHub Actions workflow (deploy.yml) that triggers after a successful build:

name: Deploy container

on:
  workflow_run:
    workflows: [ "Build container" ]
    types: [ completed ]

jobs:
  deploy:
    runs-on: ubuntu-latest
    if: ${{ github.event.workflow_run.conclusion == 'success' }}

    steps:
      - uses: google-github-actions/setup-gcloud@v3
        with:
          project_id: ${{ secrets.GCP_PROJECT }}
          service_account_email: ${{ secrets.GCP_SA_EMAIL }}
          workload_identity_provider: ${{ secrets.GCP_WIF_PROVIDER }}

      - name: Deploy to Cloud Run
        run: |
          gcloud run deploy myapp \
            --image=europe-west1-docker.pkg.dev/${{ secrets.GCP_PROJECT }}/containers/myapp:$GITHUB_SHA \
            --region=europe-west1 \
            --allow-unauthenticated \
            --port=8080

Now every successful push to main results in an automatic deployment to production.

You can take it further by splitting environments — e.g. main deploys to staging, tagged releases to production — but even this simple setup is a big step forward from ssh and git pull.


Step 5: Environment variables and configuration

Each Cloud Run service can have its own configuration and secrets. You can set these from the console or CLI:

gcloud run services update myapp \
  --set-env-vars="DANCER_ENV=production,DATABASE_URL=postgres://..."

In your Dancer2 app, you can then access them with:

$ENV{DATABASE_URL}

It’s a good idea to keep database credentials and API keys out of your code and inject them at deploy time like this.
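
As a small illustration (the /status route and the fallback values are hypothetical, not taken from my apps), reading those settings from inside a Dancer2 app is just a matter of looking in %ENV:

use Dancer2;

# Values injected by Cloud Run at deploy time, with local-development fallbacks
my $db_url     = $ENV{DATABASE_URL} || 'postgres://localhost/myapp_dev';
my $dancer_env = $ENV{DANCER_ENV}   || 'development';

get '/status' => sub {
    # e.g. pass $db_url to your DBI/DBIx::Class connection code elsewhere
    return "Running in $dancer_env";
};

dance;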


Step 6: Monitoring and logs

Cloud Run integrates neatly with Google Cloud’s logging tools.

To see recent logs from your app:

gcloud logging read 'resource.type="cloud_run_revision" AND resource.labels.service_name="myapp"' --project=$PROJECT_NAME --limit=50

You’ll see your Dancer2 warn and die messages there too, because STDOUT and STDERR are automatically captured.
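
For example (the /debug route here is made up), anything a route sends to STDERR with warn shows up alongside the request logs:

use Dancer2;

get '/debug' => sub {
    # warn() writes to STDERR, which Cloud Run captures into its logs
    warn "handling /debug for " . request->address . "\n";
    return 'ok';
};

dance;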

If you prefer a UI, you can use the Cloud Console’s Log Explorer to filter by service or severity.


Step 7: The payoff

Once you’ve done one migration, the next becomes almost trivial. Each Dancer2 app gets:

  • Its own Dockerfile and GitHub workflows.

  • Its own Cloud Run service and domain.

  • Its own scaling and logging.

And none of them share a single byte of RAM with each other.

Here’s how the experience compares:

Aspect            Old VPS                   Cloud Run
OS maintenance    Manual upgrades           Managed
Scaling           Fixed size                Automatic
SSL               Let’s Encrypt renewals    Automatic
Deployment        SSH + git pull            Push to GitHub
Cost              Fixed monthly             Pay-per-request
Downtime risk     One app can crash all     Each isolated

For small apps with light traffic, Cloud Run often costs pennies per month – less than the price of a coffee for peace of mind.


Lessons learned

After a few migrations, a few patterns emerged:

  • Keep apps self-contained. Don’t share config or code across services; treat each app as a unit.

  • Use digest-based deploys. Deploy by image digest (@sha256:...) rather than tag for true immutability.

  • Logs are your friend. Cloud Run’s logs are rich; you rarely need to ssh anywhere again.

  • Cold starts exist, but aren’t scary. If your app is infrequently used, expect the first request after a while to take a second longer.

  • CI/CD is liberating. Once the pipeline’s in place, deployment becomes a non-event.


Costs and practicalities

One of the most pleasant surprises was the cost. My smallest Dancer2 app, which only gets a handful of requests each day, usually costs under £0.50/month on Cloud Run. Heavier ones rarely top a few pounds.

Compare that to the £10–£15/month I was paying for the old VPS — and the VPS didn’t scale, didn’t auto-restart cleanly, and didn’t come with HTTPS certificates for free.


What’s next

This post covers the essentials: containerising a Dancer2 app and deploying it to Cloud Run via GitHub Actions.

In future articles, I’ll look at:

  • Connecting to persistent databases.

  • Using caching.

  • Adding monitoring and dashboards.

  • Managing secrets with Google Secret Manager.


Conclusion

After two decades of running Perl web apps on traditional servers, Cloud Run feels like the future has finally caught up with me.

You still get to write your code in Dancer2 – the framework that’s made Perl web development fun for years – but you deploy it in a way that’s modern, repeatable, and blissfully low-maintenance.

No more patching kernels. No more 3 a.m. alerts. Just code, commit, and dance in the clouds.

The post Dancing in the Clouds: Moving Dancer2 Apps from a VPS to Cloud Run first appeared on Perl Hacks.

Go Ahead ‘make’ My Day (Part II)

In our previous blog post “Go Ahead ‘make’ My Day” we presented the scriptlet, an advanced make technique for spicing up your Makefile recipes. In this follow-up, we’ll deconstruct the scriptlet and detail the ingredients that make up the secret sauce.


Introducing the Scriptlet

Makefile scriptlets are an advanced technique that uses GNU make’s powerful functions to safely embed a multi-line script (Perl, in our example) into a single, clean shell command. It turns a complex block of logic into an easily executable template.

An Example Scriptlet

#-*- mode: makefile; -*-

DARKPAN_TEMPLATE="https://cpan.openbedrock.net/orepan2/authors/D/DU/DUMMY/%s-%s.tar.gz"

define create_requires =
 # scriptlet to create cpanfile from a list of required Perl modules
 # skip comments
 my $DARKPAN_TEMPLATE=$ENV{DARKPAN_TEMPLATE};

 while (s/^#[^\n]+\n//g){};

 # skip blank lines
 while (s/\n\n/\n/) {};

 for (split/\n/) { 
  my ($mod, $v) = split /\s+/;
  next if !$mod;

  my $dist = $mod;
  $dist =~s/::/\-/g;

  my $url = sprintf $DARKPAN_TEMPLATE, $dist, $v;

  print <<"EOF";
requires \"$mod\", \"$v\",
  url => \"$url\";
EOF
 }

endef

export s_create_requires = $(value create_requires)

cpanfile.darkpan: requires.darkpan
    DARKPAN_TEMPLATE=$(DARKPAN_TEMPLATE); \
    DARKPAN_TEMPLATE=$$DARKPAN_TEMPLATE perl -0ne "$$s_create_requires" $< > $@ || rm $@

Dissecting the Scriptlet

1. The Container: Defining the Script (define / endef)

This section creates the multi-line variable that holds your entire Perl program.

define create_requires =
# Perl code here...
endef
  • define ... endef: This is GNU Make’s mechanism for defining a recursively expanded variable that spans multiple lines. The content is not processed by the shell yet; it’s simply stored by make.
  • The Advantage: This is the only clean way to write readable, indented code (like your while loop and if statements) directly inside a Makefile.

2. The Bridge: Passing Environment Data (my $ENV{...})

This is a critical step for making your script template portable and configurable.

my $DARKPAN_TEMPLATE=$ENV{DARKPAN_TEMPLATE};
  • The Problem: Your Perl script needs dynamic values (like the template URL) that are set by make.
  • The Solution: Instead of hardcoding the URL, the Perl code is designed to read from the shell environment variable $ENV{DARKPAN_TEMPLATE}. This makes the script agnostic to its calling environment, delegating the data management back to the Makefile.

3. The Transformer: Shell Preparation (export and $(value))

This is the “magic” that turns the multi-line Make variable into a single, clean shell command.

export s_create_requires = $(value create_requires)
  • $(value create_requires): This Make function returns the variable’s raw content without expanding it. Crucially, that keeps make from mangling the $ signs inside the Perl code ($_, $mod, $ENV{DARKPAN_TEMPLATE} and friends), so the multi-line block, line breaks and quotes included, is stored for export exactly as it was written.
  • export s_create_requires = ...: This exports the multi-line Perl script content as an environment variable (s_create_requires) that will be accessible to any shell process running in the recipe’s environment.
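
A stripped-down illustration of the difference (the greet scriptlet is invented purely for this demo):

define greet =
print "hello, $ENV{USER}\n";
endef

# With plain $(greet), make would expand the $E inside $ENV (an empty variable),
# so the Perl code would arrive as "hello, NV{USER}". $(value) hands over the
# raw text instead.
export s_greet = $(value greet)

demo:
    perl -e "$$s_greet"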

4. The Execution: Atomic Execution ($$ and perl -0ne)

The final recipe executes the entire, complex process as a single, atomic operation, which is the goal of robust Makefiles.

cpanfile.darkpan: requires.darkpan
    DARKPAN_TEMPLATE=$(DARKPAN_TEMPLATE); \
    DARKPAN_TEMPLATE=$$DARKPAN_TEMPLATE perl -0ne "$$s_create_requires" $< > $@ || rm $@
  • DARKPAN_TEMPLATE=$(DARKPAN_TEMPLATE): This creates the local shell variable.
  • DARKPAN_TEMPLATE=$$DARKPAN_TEMPLATE perl...: This is the clean execution. The DARKPAN_TEMPLATE=... prefix passes the shell variable’s value into the environment of the perl process for that one command. The $$ escapes the dollar sign from make, so it is the shell, not make, that expands the variable when the recipe runs.
  • perl -0ne "...": Runs the Perl script:
    * `-e` supplies the script on the command line and `-n` wraps it
      in a `while (<>)` loop over the input.
    * `-0` sets the input record separator to the NUL character, so a
      normal text file is read as one single block (slurping the
      file), which is necessary for the multi-line substitutions
      and `split/\n/` logic.
    
  • || rm $@: This is the final mark of quality. It makes the entire command transactional—if the Perl script fails, the half-written target file ($@) is deleted, forcing make to try again later.

Hey Now! You’re a Rockstar!

(..get your game on!)

Mastering build automation using make will transform you from being an average DevOps engineer into a rockstar. GNU make is a Swiss Army knife with more tools than you might think! The knives are sharp and the tools are highly targeted to handle all the real-world issues build automation has encountered over the decades. Learning to use make effectively will put you head and shoulders above the herd (see what I did there? 😉).

Calling All Pythonistas!

The scriptlet technique creates a powerful, universal pattern for clean, atomic builds:

  • It’s Language Agnostic: Pythonistas! Join the fun! The same define/export technique works perfectly with python -c.
  • The Win: This ensures that every developer - regardless of their preferred language - can achieve the same clean, atomic build and avoid external script chaos.

Learn more about GNU make and move your Makefiles from simple shell commands to precision instruments of automation.

Thanks for reading.

Learn More

Shutter crashing

Perl Maven

A long time ago I used Shutter and found it an excellent tool. Now I get all kinds of crashes.

Actually "Now" was a while ago, since then I upgraded Ubuntu and now I get all kinds of other error messages.

However, I wonder.

Why are there so many errors?

Whose fault is it?

  • A failure of the Perl community?

  • A failure of the Ubuntu or the Debian developers?

  • A failure of the whole idea of Open Source?

  • Maybe I broke the system?

It starts so badly and then it crashes. I don't want to spend time figuring out what the problem is. I don't even have the energy to open a ticket. I am not even sure where I should do it. On Ubuntu? On the Shutter project?

Here is the output:

$ shutter
Subroutine Pango::Layout::set_text redefined at /usr/share/perl5/Gtk3.pm line 2299.
	require Gtk3.pm called at /usr/bin/shutter line 72
	Shutter::App::BEGIN() called at /usr/bin/shutter line 72
	eval {...} called at /usr/bin/shutter line 72
Subroutine Pango::Layout::set_markup redefined at /usr/share/perl5/Gtk3.pm line 2305.
	require Gtk3.pm called at /usr/bin/shutter line 72
	Shutter::App::BEGIN() called at /usr/bin/shutter line 72
	eval {...} called at /usr/bin/shutter line 72
GLib-GObject-CRITICAL **: g_boxed_type_register_static: assertion 'g_type_from_name (name) == 0' failed at /usr/lib/x86_64-linux-gnu/perl5/5.36/Glib/Object/Introspection.pm line 110.
 at /usr/share/perl5/Gtk3.pm line 489.
	Gtk3::import("Gtk3", "-init") called at /usr/bin/shutter line 72
	Shutter::App::BEGIN() called at /usr/bin/shutter line 72
	eval {...} called at /usr/bin/shutter line 72
GLib-CRITICAL **: g_once_init_leave: assertion 'result != 0' failed at /usr/lib/x86_64-linux-gnu/perl5/5.36/Glib/Object/Introspection.pm line 110.
 at /usr/share/perl5/Gtk3.pm line 489.
	Gtk3::import("Gtk3", "-init") called at /usr/bin/shutter line 72
	Shutter::App::BEGIN() called at /usr/bin/shutter line 72
	eval {...} called at /usr/bin/shutter line 72
GLib-GObject-CRITICAL **: g_boxed_type_register_static: assertion 'g_type_from_name (name) == 0' failed at /usr/lib/x86_64-linux-gnu/perl5/5.36/Glib/Object/Introspection.pm line 110.
 at /usr/share/perl5/Gtk3.pm line 489.
	Gtk3::import("Gtk3", "-init") called at /usr/bin/shutter line 72
	Shutter::App::BEGIN() called at /usr/bin/shutter line 72
	eval {...} called at /usr/bin/shutter line 72
GLib-CRITICAL **: g_once_init_leave: assertion 'result != 0' failed at /usr/lib/x86_64-linux-gnu/perl5/5.36/Glib/Object/Introspection.pm line 110.
 at /usr/share/perl5/Gtk3.pm line 489.
	Gtk3::import("Gtk3", "-init") called at /usr/bin/shutter line 72
	Shutter::App::BEGIN() called at /usr/bin/shutter line 72
	eval {...} called at /usr/bin/shutter line 72
GLib-GObject-CRITICAL **: g_boxed_type_register_static: assertion 'g_type_from_name (name) == 0' failed at /usr/lib/x86_64-linux-gnu/perl5/5.36/Glib/Object/Introspection.pm line 110.
 at /usr/share/perl5/Gtk3.pm line 489.
	Gtk3::import("Gtk3", "-init") called at /usr/bin/shutter line 72
	Shutter::App::BEGIN() called at /usr/bin/shutter line 72
	eval {...} called at /usr/bin/shutter line 72
GLib-CRITICAL **: g_once_init_leave: assertion 'result != 0' failed at /usr/lib/x86_64-linux-gnu/perl5/5.36/Glib/Object/Introspection.pm line 110.
 at /usr/share/perl5/Gtk3.pm line 489.
	Gtk3::import("Gtk3", "-init") called at /usr/bin/shutter line 72
	Shutter::App::BEGIN() called at /usr/bin/shutter line 72
	eval {...} called at /usr/bin/shutter line 72
Variable "$progname_active" will not stay shared at /usr/bin/shutter line 2778.
Variable "$progname" will not stay shared at /usr/bin/shutter line 2779.
Variable "$im_colors_active" will not stay shared at /usr/bin/shutter line 2787.
Variable "$combobox_im_colors" will not stay shared at /usr/bin/shutter line 2788.
Variable "$trans_check" will not stay shared at /usr/bin/shutter line 2798.


... About 700 similar error messages ...


Name "Gtk3::Gdk::SELECTION_CLIPBOARD" used only once: possible typo at /usr/bin/shutter line 291.
WARNING: gnome-web-photo is missing --> screenshots of websites will be disabled!

 at /usr/bin/shutter line 9038.
	Shutter::App::fct_init_depend() called at /usr/bin/shutter line 195
Useless use of hash element in void context at /usr/share/perl5/Shutter/App/Common.pm line 77.
	require Shutter/App/Common.pm called at /usr/bin/shutter line 206
Useless use of hash element in void context at /usr/share/perl5/Shutter/App/Common.pm line 80.
	require Shutter/App/Common.pm called at /usr/bin/shutter line 206
Subroutine lookup redefined at /usr/share/perl5/Shutter/Draw/DrawingTool.pm line 28.
	require Shutter/Draw/DrawingTool.pm called at /usr/bin/shutter line 228
Variable "$self" will not stay shared at /usr/share/perl5/Shutter/Draw/DrawingTool.pm line 671.
	require Shutter/Draw/DrawingTool.pm called at /usr/bin/shutter line 228
Variable "$self" will not stay shared at /usr/share/perl5/Shutter/Screenshot/SelectorAdvanced.pm line 840.
	require Shutter/Screenshot/SelectorAdvanced.pm called at /usr/bin/shutter line 233
Failed to register: GDBus.Error:org.freedesktop.DBus.Error.NoReply: Message recipient disconnected from message bus without replying

Adding tests to legacy Perl code

Perl Maven

Notes from the live-coding session (part of the Perl Maven live events).

Meeting summary

Quick recap

The meeting began with informal introductions and discussions about Perl programming, including experiences with maintaining Perl codebases and the challenges of the language's syntax. The main technical focus was on testing and code coverage, with detailed demonstrations of using Devel::Cover and various testing modules in Perl, including examples of testing SVG functionality and handling exceptions. The session concluded with discussions about testing practices, code coverage implementation, and the benefits of automated testing, while also touching on practical aspects of Perl's object-oriented programming and error handling features.

SVG Test Coverage Analysis

Gabor demonstrated how to use Devel::Cover to generate test coverage reports for the SVG.pm module. He showed that the main module has 98% coverage, while some submodules have lower coverage. Gabor explained how to interpret the coverage reports, including statement, branch, and condition coverage. He also discussed the importance of identifying and removing unused code that appears uncovered by tests. Gabor then walked through some example tests in the SVG distribution, explaining how they verify different aspects of the SVG module's functionality.
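
To give a flavour of the kind of test being discussed, here is a hedged sketch rather than one of the actual tests from the SVG distribution; running the suite with something like HARNESS_PERL_SWITCHES=-MDevel::Cover prove -l t, followed by the cover command, produces the sort of coverage report shown in the session:

use strict;
use warnings;
use Test::More;
use SVG;

# Build a tiny document and check the generated XML for what we asked for
my $svg = SVG->new( width => 100, height => 100 );
$svg->rectangle( x => 10, y => 10, width => 30, height => 30, id => 'rect_1' );

my $xml = $svg->xmlify;
like( $xml, qr/<rect /, 'output contains a rect element' );
like( $xml, qr/rect_1/, 'the id we supplied appears in the output' );

done_testing();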

Original announcement

Adding tests to legacy Perl code

During this live coding event we'll take a Perl module from CPAN and add some tests to it.

Further events

Register on the Perl Maven Luma calendar.