News items for tag perl - Koos van den Hout

2010-10-27 (#)
I introduced a MediaWiki at work (science ICT department) to use for internal documentation. One of the things I wanted to try was having wiki pages created or maintained from other sources.

I created a special namespace for pages with information from other sources, where normal users have no rights to edit pages. This is to make sure nobody tries to edit something which is maintained by a script from another source.

I started with something simple: the list of printers. The Windows print server is the authoritative source, so I want to fetch the list there and massage it into a list of printers and comments. The weapon of choice is Perl and MediaWiki::Bot. The output of smbclient -N -L printserver takes one regexp to find print queue names and descriptions. For the overview of CUPS queues I can parse the output of lpstat -a. With a bit more digging into IPP it should also be possible to get a detailed list of printer properties to link CUPS queues and their Windows counterparts.

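A minimal sketch of what such a script could look like, assuming a bot account on the wiki; the wiki hostname, page name and credentials are made up, and the regexp keys on the "Sharename / Type / Comment" listing that smbclient prints:

#!/usr/bin/perl
# sketch: import the printer list from the windows printserver into the wiki
use strict;
use warnings;
use MediaWiki::Bot;

# smbclient lists shares as "name  type  comment"; keep only the printers
my @rows;
open(my $smb, '-|', 'smbclient', '-N', '-L', 'printserver')
    or die "cannot run smbclient: $!";
while (<$smb>) {
    push @rows, "|-\n| $1 || $2" if /^\s+(\S+)\s+Printer\s+(.*?)\s*$/;
}
close($smb);

# build a wiki table of queue names and descriptions
my $text = "{| class=\"wikitable\"\n! Queue !! Description\n"
         . join("\n", @rows) . "\n|}\n";

my $bot = MediaWiki::Bot->new({ host => 'wiki.example.com' });
$bot->login({ username => 'ImportBot', password => 'secret' })
    or die "wiki login failed\n";
$bot->edit({
    page    => 'Imported:Printers',   # page in the locked-down namespace
    text    => $text,
    summary => 'automatic printer list update',
});
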
I can run this script from crontab each day and the history tracking in MediaWiki will start to help document when something changed. Another thing which we can stop worrying about.

I have visions of a future where zabbix (which has a JSON interface) and MediaWiki are linked automatically, and maybe a further future with a good database of stuff that is the single source of entries in both zabbix and the wiki. Double work is unneeded; computers are much better at working with one canonical source and importing it in a lot of places.

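For the zabbix side, something along these lines could work: the Zabbix API speaks JSON-RPC over HTTP (available since Zabbix 1.8), and a host listing from it could feed the same wiki import. The URL, method names and credentials here are hypothetical:

#!/usr/bin/perl
# hypothetical sketch: list hosts from the Zabbix JSON-RPC API
use strict;
use warnings;
use LWP::UserAgent;
use JSON;

my $url = 'http://zabbix.example.com/api_jsonrpc.php';
my $ua  = LWP::UserAgent->new;

# one JSON-RPC call: encode the request, POST it, decode the result
sub rpc {
    my ($method, $params, $auth) = @_;
    my $res = $ua->post($url,
        'Content-Type' => 'application/json',
        Content => encode_json({
            jsonrpc => '2.0', method => $method,
            params  => $params, auth => $auth, id => 1,
        }),
    );
    die "rpc $method failed: " . $res->status_line . "\n"
        unless $res->is_success;
    return decode_json($res->content)->{result};
}

my $auth  = rpc('user.login', { user => 'apiuser', password => 'secret' });
my $hosts = rpc('host.get', { output => ['host'] }, $auth);
print "$_->{host}\n" for @$hosts;
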
2009-08-24 (#)
More than one visitor to my homepage saw a cryptic XML parsing error and not the page you all want to see. I never saw the problem myself, but my best guess so far is that the Twitter RSS feed was malformed, because that is the only XML parsing happening for the page. I fetch the Twitter feed automatically every 6 hours, but sometimes Twitter is a bit overloaded and probably serves an internal error page (the famous fail whale) instead of a valid RSS feed.

Solution: fetch the feed to a temporary file, run the parser on it, and only if the parser does not fail, copy it to where the webserver reads it:

#!/bin/sh

# fetch the feed to a temporary file (-o /dev/null silences the wget log)
wget -O wwwdata/twitter.rss.pre -o /dev/null http://twitter.com/statuses/user_timeline/19301166.rss

# try to parse it: XML::RSS dies on malformed XML, so perl exits non-zero
perl -MXML::RSS -e 'my $rss=new XML::RSS; $rss->parsefile("wwwdata/twitter.rss.pre");'

# only publish the feed when it parsed cleanly
if [ "$?" = "0" ]; then
	cp wwwdata/twitter.rss.pre wwwdata/twitter.rss
fi

2009-06-12 (#)
Some Perl code to read the balance and the last 5 transactions from a Chipknip card with Chipcard::PCSC. Always handy if you happen to have a laptop with a chipcard slot. The nice part is that you can also see the terminal's own number (SAM_ID) of the payment terminal where you did the transaction, and the sequence number (SAM_STAN), so you can see where your Chipknip has been and how many transactions those terminals get to process. Perl code to read a Chipknip balance and the last transactions.

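For reference, the Chipcard::PCSC mechanics look roughly like this; the APDU below is a generic placeholder, not the actual Chipknip purse commands (the real field layouts came from the Java code mentioned in the update below):

#!/usr/bin/perl
# sketch: talk to a chipcard with Chipcard::PCSC; the APDU is a placeholder
use strict;
use warnings;
use Chipcard::PCSC;
use Chipcard::PCSC::Card;

my $context = Chipcard::PCSC->new()
    or die "cannot connect to pcscd: $Chipcard::PCSC::errno\n";
my @readers = $context->ListReaders()
    or die "no readers found: $Chipcard::PCSC::errno\n";

my $card = Chipcard::PCSC::Card->new($context, $readers[0],
        $Chipcard::PCSC::SCARD_SHARE_SHARED)
    or die "cannot connect to card in '$readers[0]': $Chipcard::PCSC::errno\n";

# generic SELECT APDU as a placeholder; the real script sends the purse
# application select and READ RECORD commands for balance and transaction log
my $response = $card->Transmit([0x00, 0xA4, 0x04, 0x00, 0x00])
    or die "transmit failed: $Chipcard::PCSC::errno\n";
print "card answered: ", Chipcard::PCSC::array_to_ascii($response), "\n";
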
Update 2012-04-10: No, unfortunately I no longer have the original Java code from which I took the field sizes and field names. Bad of me, I should have referenced it.

2009-03-28 (#)
I noticed a few malformed characters in the RSS feed of my homepage that weren't there in the original database entries and showed up fine in the web version. Again UTF-8 problems, although all data (postgres - script - xml - browser) should be UTF-8. After lots of testing and searching I finally found The Perl UTF-8 and utf8 Encoding Mess by Jeremy Zawodny. He is right: it is a mess. And the post itself demonstrates it by being filled with � characters.
So to make sure everything in the RSS-generating process understands that what comes out of PostgreSQL is valid UTF-8 and should be imported into the XML::RSS module as the same valid UTF-8, I need to decode it as UTF-8. Uh.. ok. The bit of code:
        my $body = Encode::decode('UTF-8', $row[1]);
And now I can use ÜTF-8 çħáräćtërs!

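In context the fix looks something like this (the DBI query, table and channel details are invented for the example); the decode step turns the raw UTF-8 bytes DBD::Pg hands back into Perl characters, so XML::RSS writes them out correctly:

#!/usr/bin/perl
# sketch: generate the RSS feed from PostgreSQL with explicit UTF-8 decoding
use strict;
use warnings;
use DBI;
use Encode;
use XML::RSS;

my $dbh = DBI->connect('dbi:Pg:dbname=homepage', '', '');
my $sth = $dbh->prepare(
    'SELECT title, body FROM newsitems ORDER BY posted DESC LIMIT 10');
$sth->execute();

my $rss = XML::RSS->new(version => '2.0');
$rss->channel(
    title       => 'News items',
    link        => 'http://idefix.net/',
    description => 'latest news items',
);

while (my @row = $sth->fetchrow_array()) {
    # DBD::Pg returns raw bytes here (unless pg_enable_utf8 is set);
    # decode marks them as UTF-8 characters before they reach XML::RSS
    my $title = Encode::decode('UTF-8', $row[0]);
    my $body  = Encode::decode('UTF-8', $row[1]);
    $rss->add_item(title => $title, description => $body);
}
print $rss->as_string;
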
2008-03-20 (#)
A new version of my homepage, rewritten in Perl because PHP was starting to irritate me. It is more database-driven in the background, which allows me to add things like the tags. And a minor change in the colour scheme, because someone remarked that the black-on-cyan was hard to read for people above a certain age.

2008-03-13 (#)
One of the little irritations at work was trying to find out what the exact error of a printer was when the helpdesk ticket just says 'printer problems'. Since HP LaserJets will divulge everything via SNMP, I figured the complete information must be available. It is, and I cobbled together a Perl script for our NOC webserver. A public version is on the perl noc stuff page.

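The heart of such a script is small. A rough sketch with Net::SNMP, reading the standard Printer MIB status plus the HP enterprise OID that LaserJets answer with their front panel text (the hostname and community string are examples):

#!/usr/bin/perl
# sketch: ask a printer for its status and front panel text via SNMP
use strict;
use warnings;
use Net::SNMP;

my $printer = shift or die "usage: $0 printerhostname\n";

# hrPrinterStatus and hrPrinterDetectedErrorState from the Printer MIB
# (device index 1 is usual for a printer), plus the HP enterprise OID
# that returns the front panel display text
my $oid_status   = '1.3.6.1.2.1.25.3.5.1.1.1';
my $oid_errstate = '1.3.6.1.2.1.25.3.5.1.2.1';
my $oid_display  = '1.3.6.1.4.1.11.2.3.9.1.1.3.0';

my ($session, $error) = Net::SNMP->session(
    -hostname  => $printer,
    -community => 'public',
);
die "session to $printer failed: $error\n" unless defined $session;

my $result = $session->get_request(
    -varbindlist => [ $oid_status, $oid_errstate, $oid_display ],
);
die "query failed: " . $session->error() . "\n" unless defined $result;

printf "status: %s errorstate: %s display: %s\n",
    $result->{$oid_status}, $result->{$oid_errstate},
    $result->{$oid_display};
$session->close();
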
2008-01-02 (#)
My first CPAN upload. I uploaded Geo::METAR 1.15 to CPAN just now. Time to find out if I did stuff right.

2004-10-11 (#)
Played a bit with Geo::METAR, so now my webcam knows the almost-local weather and pastes it into the image.

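The idea, roughly: fetch the METAR for a nearby station, parse it with Geo::METAR and build one line of weather text to overlay on the image (the station and the NOAA URL are examples; the actual pasting into the image could be done with something like ImageMagick's annotate):

#!/usr/bin/perl
# sketch: fetch a nearby METAR and format one line of weather text
use strict;
use warnings;
use LWP::Simple qw(get);
use Geo::METAR;

my $station = 'EHAM';   # example station, Amsterdam/Schiphol
my $raw = get("http://tgftp.nws.noaa.gov/data/observations/metar/stations/$station.TXT")
    or die "no METAR available for $station\n";

# the NOAA file is a date line plus the report line starting with the station
my ($report) = $raw =~ /^($station\s.*)$/m
    or die "cannot find report line for $station\n";

my $m = Geo::METAR->new;
$m->metar($report);

# one line, ready to paste onto the webcam image
printf "%s %.0fC wind %s %d kt\n",
    $station, $m->TEMP_C, $m->WIND_DIR_ENG, $m->WIND_KTS;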
