News items for tag web - Koos van den Hout

2022-11-18 SSL scans showing up in the log
A comment on IRC made me have a look at the logs of my haproxy system to get an idea whether any weird vulnerability scan had come by. No special vulnerability scan showed up, but my attention was drawn to a number of lines like:
Nov 18 08:05:01 wozniak haproxy[13987]: 2001:470:1:332::28:37618 [18/Nov/2022:08:05:01.900] https-in/1: SSL handshake failure
Nov 18 08:05:44 wozniak haproxy[13987]: 2001:470:1:332::28:27286 [18/Nov/2022:08:05:44.328] https-in/1: SSL handshake failure
Nov 18 08:06:22 wozniak haproxy[13987]: 2001:470:1:332::2e:3137 [18/Nov/2022:08:06:21.962] https-in/1: SSL handshake failure
Nov 18 08:06:22 wozniak haproxy[13987]: 2001:470:1:332::2d:33085 [18/Nov/2022:08:06:22.278] https-in/1: SSL handshake failure
Nov 18 08:06:22 wozniak haproxy[13987]: 2001:470:1:332::2d:17531 [18/Nov/2022:08:06:22.593] https-in/1: SSL handshake failure
Nov 18 08:06:22 wozniak haproxy[13987]: 2001:470:1:332::30:58869 [18/Nov/2022:08:06:22.915] https-in/1: SSL handshake failure
Nov 18 08:06:23 wozniak haproxy[13987]: 2001:470:1:332::2e:46537 [18/Nov/2022:08:06:23.228] https-in/1: SSL handshake failure
Nov 18 08:06:23 wozniak haproxy[13987]: 2001:470:1:332::29:20027 [18/Nov/2022:08:06:23.544] https-in/1: SSL handshake failure
Nov 18 08:06:24 wozniak haproxy[13987]: 2001:470:1:332::31:13423 [18/Nov/2022:08:06:23.872] https-in/1: SSL handshake failure
Nov 18 08:06:24 wozniak haproxy[13987]: 2001:470:1:332::28:56683 [18/Nov/2022:08:06:24.197] https-in/1: SSL handshake failure
Nov 18 08:06:24 wozniak haproxy[13987]: 2001:470:1:332::31:5055 [18/Nov/2022:08:06:24.524] https-in/1: SSL handshake failure
Nov 18 08:06:24 wozniak haproxy[13987]: 2001:470:1:332::2e:20907 [18/Nov/2022:08:06:24.841] https-in/1: SSL handshake failure
If there are one or two of these lines from one address, it is a sign of a client which can't finish the SSL negotiation. With my site that probably means an old client which doesn't understand Let's Encrypt certificates without an extra certification path.

But this is quite a number of SSL errors from the same IPv6 range in a short time. I wondered what was behind this and did a bit of testing, until I found it's simple to cause this by running an SSL test, for example the famous Qualys SSL test or an ssl scan tool. This is logical: such a test uses a lot of different negotiations to find out what actually works.
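
Counting handshake failures per client address quickly separates a scanner from the occasional old client. A one-liner sketch, assuming the messages end up in /var/log/haproxy.log on this system:
$ grep 'SSL handshake failure' /var/log/haproxy.log | awk '{ print $6 }' | sed 's/:[0-9]*$//' | sort | uniq -c | sort -rn | head
With the log lines above, the addresses from the 2001:470:1:332::/64 range float to the top at once.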

2022-10-31 Trying mastodon for amateur radio
All the news about Twitter makes me wonder if I want to stay there in the long run.

But changing social networks is always a negative experience: you lose contacts. I still remember several people who I haven't heard much from since Google+ and wonder how they are doing!

For amateur radio I'm having a look at mastodon as @PE4KH@mastodon.radio.

One conclusion is that my own site is more permanent than any social medium. My own website survived the rise and fall of Google+, and because I imported my posts they are still available here. But interaction on my own site is complex and needs constant maintenance to avoid spam.

2022-08-26 Limiting URLs to scan with wapiti
I wanted to use wapiti as a scanner to check for other vulnerabilities in The Virtual Bookcase after receiving a report about a cross-site scripting vulnerability. Wapiti is open source and free, which is a fitting price for scanning a hobby project site.

I quickly ran into wapiti taking hours to scan because of the URL structure of the site: all /book/detail/x/y URLs map to one handler that uses the x and y parameters in SQL queries. Yes, those queries are surrounded by very defensive checking and I use positional parameters, everything to avoid SQL injection and becoming the next Little Bobby Tables.

Wapiti has no simple option that I could find to just crawl the site, report the list of URLs and stop, which would allow selecting the URLs to scan. But it does have options to minimize crawling and to import a list of additional URLs to scan, so I combined those to get the same result.

Gathering URLs was done with wget:
$ wget --spider -r http://developer.virtualbookcase.com 2>&1 | grep '^--' | egrep -v '\.(css|jpg|gif|png)' | awk '{ print $3}' > developer.virtualbookcase.com-urls.txt
After that I sorted the file with URLs and threw out a lot of them, making sure all the scripts were still tested with several variants of input.
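
Thinning out the list can be done in the same style; a sketch of the idea that keeps at most two URLs per script path (my actual selection was partly done by hand):
$ sort -u developer.virtualbookcase.com-urls.txt | awk -F/ 'seen[$4 "/" $5]++ < 2' > urls.tmp && mv urls.tmp developer.virtualbookcase.com-urls.txt
That keeps two example URLs for every /book/detail style handler instead of one URL per book.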

With that list I start wapiti with some special options. It still needs a starting URL with -u so I give it the root, but I limit the crawling with the depth parameter -d 1 and the max files parameter --max-files-per-dir 50. Then I add the additional URLs from the earlier crawl with the -s parameter. It's a lot of tweaking but it does the trick.
$ wapiti -u http://developer.virtualbookcase.com/ -d 1 --max-files-per-dir 50 -s developer.virtualbookcase.com-urls.txt -o ~/wapiti/ -v 2
No vulnerabilities were found. I did find one PHP warning which only triggered in the kind of corner case a web vulnerability scanner, or an attacker, causes. So I fixed that corner case too.

2022-08-25 D'oh!!! A cross-site scripting vulnerability in one of my own sites
I received a responsible disclosure report of a vulnerability in The Virtual Bookcase.

I will readily admit I haven't done a lot of maintenance on this site in the past few years, but I want to keep my sites secure.

The report came via openbugbounty.org and has no details about the vulnerability, so I am not 100% sure where the reported vulnerability is. But based on the report type, XSS (cross-site scripting), and a peek in the access log looking for specific requests, I found I had made a beginner's mistake in dealing with a search query: displaying it as-is within an HTML context. I immediately fixed that error in the site.
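
Those specific requests stand out when you grep for script tags, encoded or not; a sketch, with the log path depending on the webserver:
$ grep -Ei '<script|%3cscript' /var/log/apache2/access.log
In PHP the textbook countermeasure is to run such values through htmlspecialchars() before output, so the browser renders the attempt instead of executing it.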

Now I wonder why it took so long for me to realize the error of my ways or for someone to notice it!

Checking the logs some more turns up huge numbers of attempted SQL injections, which is a vulnerability I am very aware of and have standard defenses against. But this is the first time a security researcher made me aware of the cross-site scripting vulnerability.

Update: I contacted the reporter about the vulnerability, who responded quickly inquiring about a possible bounty for finding the bug. As this is a site that hasn't generated any income in years, the best I can do is a mention in the credits of the site or on a separate hall of fame page.

Update: I also started a vulnerability scanner on the site myself, to find any other vulnerabilities I might have missed. This scanner is going through the development site at the moment. Like many other scanners, it doesn't see by default how certain URLs all map to the same PHP script.

I already committed a few minor updates to improve the handling of corner cases with unset variables and other things popping up in the scan.

Update 2022-09-23: I realized the reporter has never responded with the actual bug information.

2022-07-20 I redid my 'recent QSO map' with leafletjs and openstreetmap tiles
[Screenshot: PE4KH QSO map showing the Faroe Islands]
My todo-list for hobby projects has had an entry 'redo maps in sites using leaflet' for a while, and on an otherwise calm evening I got around to it. The first thing to upgrade was the recent contact map for PE4KH, which shows an overview of the places of my last 150 contacts, plotted on a map with some details per contact.

I'm not good at JavaScript programming at all, so I just look for examples that come close to what I want and adjust them until they do what I want. Luckily I found some good geojson examples and I managed to get the points on the map. After a bit of massaging, trying and reading I managed to add the popup with the location. The next and harder bit was adding default and non-default icons. Eventually I got my brain wrapped around the bits needed for that too. After that the test version got deployed to production and you can look at it now.

The main reasons for switching to Leaflet are that Google Maps was limiting free access to maps, although they seem to have mostly reverted that plan, and that I wanted to promote OpenStreetMap.

The general conclusion is that sites with maps do need regular maintenance: if hosted Leaflet goes away or stops serving this version, if the rules for using hosted OpenStreetMap tiles change, or if something else happens, I have to adapt the site, maybe even quite fast.

2022-07-13 Adding pictures to the reports of our trip to Iceland
I created a flickr album Iceland 2022 - Our trip to Iceland in April/May 2022, but linking to the pictures from the right report was still kind of hard because it's a complicated bit of HTML with repetition and chances for errors.

The solution: make the computer help me. The flickr API allows me to fetch data about an album and about the pictures in that album, so I spent an evening writing some Perl to get links to all the pictures in the album, with thumbnails.
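
Getting the list of pictures is one API call, which is easy to check from the shell before writing the Perl (KEY, USERID and ALBUMID are placeholders):
$ curl -s 'https://api.flickr.com/services/rest/?method=flickr.photosets.getPhotos&api_key=KEY&user_id=USERID&photoset_id=ALBUMID&format=json&nojsoncallback=1'
Each photo in the response has a server, id and secret, which combine into image URLs like https://live.staticflickr.com/{server}/{id}_{secret}_t.jpg for thumbnails.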

Now most days of Complete reports of our trip to Iceland have been enhanced with pictures.

2022-06-22 Maintaining the weblog software of the netwerkgroep website
From time to time I check whether the netwerkgroep weblog needs updates to the Serendipity weblog software, which has been running since the installation of Serendipity for the netwerkgroep in 2006. Serendipity is written with security in mind, so this is only rarely needed.

Today it turned out to be necessary after all, and I couldn't complete the update in one go. Apparently some permissions were still wrong since the virtual machine for the webserver had crashed. After fixing those, everything is quietly running again.

In the old articles, links to external sites naturally break once in a while. I still do some maintenance on those.

2021-11-19 Attacks on new sites are fast!
I was working on a new site for a project and requested a certificate for it. The time between the certificate being generated and the first attack was 3 minutes and 7 seconds.

15:12:10 UTC: certificate generated and published on the certificate transparency log
15:15:17 UTC:
185.67.34.1 - - [19/Nov/2021:16:15:17 +0100] "GET /restapi.php HTTP/1.1" 404 1008 "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:84.0) Gecko/20100101 Firefox/84.0"
15:15:18 UTC:
185.67.34.1 - - [19/Nov/2021:16:15:18 +0100] "POST /gate.php HTTP/1.1" 404 1008 "-" "Dalvik/2.1.0 (Linux; U; Android 6.0.1; SM-J700F Build/MMB29K)"
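
The explanation is simple: attackers watch the public certificate transparency logs for new names to probe. Site owners can watch the same data, for example via crt.sh (a quick check, assuming jq is available; example.org stands in for the real name):
$ curl -s 'https://crt.sh/?q=example.org&output=json' | jq -r '.[].name_value' | sort -u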

2021-10-18 Securing the home network: a separate DMZ network
I have a lot of control over the software that runs on systems at home, but there are limits to what I can fix and sometimes things are insecure.

Things like the recent WordPress brute-force attacks show that random 'loud' attackers who don't care about the chance of getting noticed will try. I sometimes do worry about the silent and more targeted attackers.

So recently I updated my home network and I now have a DMZ network. At the moment it is a purely virtual network, as it doesn't leave the KVM server. Hosts in the DMZ have a default-deny firewall policy towards the other inside networks; specific services on specific hosts have been enabled.
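
In firewall terms the policy looks something like this: a minimal nftables sketch with made-up interface names and one example service, not my actual ruleset:
table inet filter {
    chain forward {
        type filter hook forward priority 0; policy drop;
        # replies to established connections are always allowed
        ct state established,related accept
        # the inside network may reach the DMZ
        iifname "br-lan" oifname "br-dmz" accept
        # from the DMZ only specific services on specific hosts
        iifname "br-dmz" oifname "br-lan" ip daddr 192.0.2.10 tcp dport 5432 accept
    }
}
Loaded with nft -f, everything else from the DMZ towards the inside networks is dropped by the chain policy.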

I first moved the development webserver, which allowed me to tune those firewall rules and fix some other errors.

Now the other webservers and other servers offering things to the outside world have moved there too.

2021-10-07 Adding security headers to websites I develop and run
As someone interested in security I'm also busy securing the websites I develop and run. I'm looking at Content-Security-Policy headers and I notice those seem 'easier' for sites that have one task and one source of development, like Camp Wireless, and somewhat harder for sites that collect pages/scripts/materials over the years, like idefix.net.

Camp Wireless can have some advertising though, which suddenly turns the whole thing around, since advertising scripts can load other advertising scripts completely dynamically. Searching for 'google adwords' and 'Content-Security-Policy' gave me Can Content Security Policy be made compatible with Google Analytics and AdSense? and the answer seems to be either "no" or "with a lot of work which you have to keep updating".

Update: I temporarily added a Content-Security-Policy-Report-Only header to get an idea what kind of problems I will run into (with my own reporting backend). A lot of them: all inline javascript is suddenly a problem. So a 'fully secured' Content-Security-Policy header is already hard for single-task, single-source websites, let alone websites with a lot of history in the pages.
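
The nice thing about the report-only variant is that nothing breaks while measuring. A minimal sketch for Apache with mod_headers enabled (the policy and report endpoint are examples, not my production configuration):
Header set Content-Security-Policy-Report-Only "default-src 'self'; report-uri /csp-report.cgi"
With default-src 'self' every inline script is a violation, which is exactly the flood of reports described above; fixing it means moving javascript to separate files or adding nonces or hashes to the policy.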

