newspaint

Documenting Problems That Were Difficult To Find The Answer To

Category Archives: Technology

OpenWRT/LEDE Buffalo WZR-HP-AG300H Getting 5GHz Radio Working

I had a problem with my Buffalo WZR-HP-AG300H: I couldn’t get the 5GHz wireless interface working alongside the 2.4GHz radio.

In the end I used the following /etc/config/wireless configuration:

config wifi-device 'radio0'
        option type 'mac80211'
        option phy 'phy0'
        option txpower '7'
        option country 'GB'
        option hwmode '11g'
        option channel '7'
        option htmode 'HT20'

config wifi-device 'radio1'
        option type 'mac80211'
        option phy 'phy1'
        option txpower '9'
        option country 'GB'
        option hwmode '11a'
        option channel '120'
        option htmode 'HT40'

config wifi-iface
        option device 'radio0'
        option mode 'ap'
        option ssid 'my2500KHz'
        option network 'wlan'
        option encryption 'psk2'
        option key 'password'
        option wmm '0'

config wifi-iface
        option device 'radio1'
        option mode 'ap'
        option ssid 'my5GHz'
        option network 'wlan'
        option encryption 'psk2'
        option key 'password'
        option wmm '0'

This configuration seemed to work for me only after I rebooted the router.
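
If a reboot is inconvenient, committing the configuration and re-applying it from a shell on the router may also be enough (though in my case a reboot was what finally worked):

# commit any pending uci changes and re-apply the wireless configuration
uci commit wireless
wifi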

LXC Container Reports PTY allocation request failed on channel 0 On SSH Connection

I tried upgrading LXC on Ubuntu Trusty 14.04 by running sudo apt-get install lxc because, by default, the lxc package was not being upgraded.

But I then had problems getting consoles/terminals with my existing LXC containers.

This problem exhibits itself when attempting to ssh to an LXC container with the following message:

# ssh ubuntu@10.0.3.201
ubuntu@10.0.3.201's password: 
PTY allocation request failed on channel 0

It also exhibits itself when attempting to lxc-console to an LXC container:

# sudo lxc-console -n mycontainer
lxc-console: commands.c: lxc_cmd_console: 722 Console -1 invalid, busy or all consoles busy.

(although a workaround is to connect using sudo lxc-console -n mycontainer -t 0).

The issue is that every container config file needs to have some extra lines added:

# required for lxc-console to work
lxc.tty = 4

# required for interactive SSH to work
lxc.pts = 1024

One other issue I came across was that I would get the following errors when trying to start a container:

# sudo lxc-start -F -n mycontainer
Failed to mount cgroup at /sys/fs/cgroup/systemd: Permission denied
[!!!!!!] Failed to mount API filesystems, freezing.
Freezing execution.

This was bypassed by adding the following to the container’s config file:

# disable AppArmor restrictions on container
lxc.aa_profile = unconfined
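
Config file changes are only read when a container starts, so after editing the config the container has to be stopped and started again, for example:

# restart the container so the new config lines take effect
sudo lxc-stop -n mycontainer
sudo lxc-start -n mycontainer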

VLC on Ubuntu 16.04 with NVidia Graphics Card – Divx Video Playback Blank

I attempted to play a DivX-encoded video in VLC 2.2.2 running on Ubuntu 16.04 LTS. The video was blank, although if, while the video was playing, I selected Video > Video Track > Disable and then Video > Video Track > Track 1 from the menu, it would display the current frame as a still image.

I opened the messages window by selecting Tools > Messages from the menu and raised the Verbosity from 0 (errors) to 1 (warnings). Then I pressed play on the video for a short period to capture the warnings.

It displayed messages like:

avi warning: multiple riff -> OpenDML ?
avi warning: detected OpenDML file
avcodec info: Using NVIDIA VDPAU Driver Shared Library 384.59 Wed Jul 19 23:45:51 PDT 2017 for hardware decoding.
avcodec warning: cannot decode one frame (337 bytes)
core warning: VoutDisplayEvent 'pictures invalid'
core warning: VoutDisplayEvent 'pictures invalid'
avcodec warning: cannot decode one frame (337 bytes)
avcodec warning: cannot decode one frame (190 bytes)
avcodec warning: cannot decode one frame (190 bytes)

This led me to suspect the VDPAU driver was failing.

A simple fix (although possibly not the most efficient) is to select Tools > Preferences from the menu, select “Input / Codecs” from the icons at the top of the Simple Preferences dialog box, and change the first option, “Hardware-accelerated decoding”, from “Automatic” to “Disable”. Then click “Save” at the bottom of the dialog box.
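
If you prefer to test from the command line first, the same effect can (I believe) be achieved for a single run using VLC’s --avcodec-hw option; the filename below is just a placeholder:

# play the file with hardware-accelerated decoding disabled for this run only
vlc --avcodec-hw=none myvideo.avi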

USB Tethering From CyanogenMod Android to Ubuntu Trusty 14.04

My laptop could not connect to the hotel’s WiFi but my mobile phone could. So I went into my phone settings, selected “…More”, selected “Tethering & portable hotspot”, and enabled “USB tethering”. This was while my phone was configured to be in “charge only” mode on USB.

My phone was attached to my Ubuntu computer by USB cable. If I clicked on the Network Manager applet on my start bar (using Xubuntu) it showed me the option “Ethernet Network (my phone model)”, but it was greyed out. So Ubuntu had detected that the phone had tethering turned on but wasn’t able to connect to it.

Automatic Option

Click on the Network Manager applet. At the bottom of the menu choose “Edit”.

Press “Add” to add a network connection.

Choose a connection type of “Ethernet” from the drop-down and press the “Create…” button.

Give the connection a name, e.g. “Tethering My Phone USB”. Select your USB interface from the drop-down list of “Device MAC address” on the “Ethernet” tab (which is opened by default).

Choose “Save…” and the tethered network should automatically begin to work.

Manual Option (if all else fails)

The solution was to open a terminal and run:

$ sudo ifconfig usb0 up
$ sudo dhclient usb0

Now I had an IP address assigned to my usb0 interface and a default route.
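
To confirm the interface really is up you can check the assigned address and the routing table:

# verify usb0 has an address and that a default route now exists
ifconfig usb0
route -n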

ZFS Grub Issues on Boot

I had a problem when attempting to boot into my ZFS root and landed at the initramfs rescue prompt.

Using advice from this article:

Command: zpool import -N
Message: cannot import '': no such pool available
Error: 1

Manually import the root pool at the command prompt and then exit.
Hint: Try:  zpool import -f -R / -N


BusyBox v1.22.1 (Ubuntu 1:1.22.0-15ubuntu1) built-in shell (ash)
Enter 'help' for a list of built-in commands.

(initramfs) zpool import -f -R / -N rpool
(initramfs) exit

Begin: Setting mountpoint=/ on ZFS filesystem  ... done
Begin: Mounting ZFS filesystem  ... done
Command: mount -t zfs -o zfsutil  /root
Message: filesystem '' cannot be mounted, unable to open the dataset
mount: mounting  on /root failed: No such file or directory
Error: 1

Manually mount the root filesystem on /root and then exit.


BusyBox v1.22.1 (Ubuntu 1:1.22.0-15ubuntu1) built-in shell (ash)
Enter 'help' for a list of built-in commands.

(initramfs)

To fix this issue I ran:

(initramfs) zpool import -R /root rpool
(initramfs) exit

Unfortunately there is a known bug in the Ubuntu 16.04.1 grub-probe command, which reports “error: unknown filesystem” when running update-grub.

The work-around is to update /etc/default/grub and add:

GRUB_CMDLINE_LINUX_DEFAULT="boot=zfs root=ZFS=rpool/ROOT"

This results in the grub menu specifying the root parameter twice on the kernel boot line in /boot/grub/grub.cfg but the second one takes precedence.

Monitor No Signal on Xubuntu 16.04.1

My Dell server seemed to stop outputting to the monitor on the VGA cable. No signal, the monitor said. It was blank, it was black, it was powered off. I tried unplugging the cable and plugging it back in, no joy.

I tried pressing ctrl-alt-F1 to switch to the text console, and the screen came alive, but all I could see was a flashing underline cursor in the upper left-hand corner, no login prompt. Same thing for ctrl-alt-F2. I tried ctrl-alt-F7 to get back to graphics mode and the monitor turned off again.

The following repaired the issue for me without having to reboot, but it did kill my GUI session and all open windows:

sudo /etc/init.d/lightdm restart

My monitor came back alive and I found myself at the GUI XFCE login prompt.

Checking SSL Certificate Expiry on Remote Server using PowerShell

Overview

There are a number of approaches to take to get the expiry time of the SSL certificate on a remote server using PowerShell. This tutorial will be conducted using PowerShell 2.0 and .NET 3.5 for maximum compatibility (as there are some organisations out there still using Microsoft Windows 2003).

The Simple Way

If you’re reasonably assured your remote server exists and you have connectivity to it then you can write a simple script to:

  • make a TCP connection to the SSL port of the host you wish to check
  • obtain an SSL stream from the TCP connection
  • SSL authenticate as a client
  • obtain the X509 certificate of the remote server from the SSL stream
  • obtain the NotAfter field from the X509 certificate

That script is as follows:

Set-StrictMode -Version 2.0

#Requires -Version 2.0

$HostName = "www.google.com"
$Port = 443

# get TCP connection
[System.Net.Sockets.TcpClient]$TcpClient = $null
$TcpClient = New-Object "System.Net.Sockets.TcpClient"
try {
    $TcpClient.Connect( [System.String]$HostName, [System.Int32]$Port )
} catch {
    Throw "TCP connection error: $_"
}

# get SSL stream from TCP connection
[System.Net.Security.SslStream]$SslStream = $null
$SslStream = $TcpClient.GetStream()

# authenticate SSL stream
try {
    $SslStream.AuthenticateAsClient( $HostName )
} catch {
    Throw "Failed to authenticate SSL stream: $_"
}

# get X509 certificate
[System.Security.Cryptography.X509Certificates.X509Certificate]$cert = $null
$cert = $SslStream.RemoteCertificate

# get X509 certificate with extra properties
[System.Security.Cryptography.X509Certificates.X509Certificate2]$cer2 = $null
$cer2 = New-Object "System.Security.Cryptography.X509Certificates.X509Certificate2" -ArgumentList $cert

# output expiry
$cer2.NotAfter

# close stream and connection
$SslStream.Close()
$TcpClient.Close()

Implementing Timeouts

The fact is that some operations can take a long time when things go wrong. In the code above there are two places where things can block for a long time: making the TCP connection (if the remote end is not responding or a firewall is silently dropping traffic), and authenticating the SSL stream (when, for example, the connected service is not SSL and does not respond to the authentication handshake).

In PowerShell we can use the Begin/End form of these operations and wait up to a specified number of milliseconds before giving up. The code to do that follows:


Set-StrictMode -Version 2.0

#Requires -Version 2.0

$HostName = "www.google.com"
$Port = 443

# get TCP connection
[System.Net.Sockets.TcpClient]$TcpClient = $null
$TcpClient = New-Object "System.Net.Sockets.TcpClient"
[System.IAsyncResult]$IAsyncResult = $TcpClient.BeginConnect(
    [String]$HostName,
    [System.Int32]$Port,
    $null, # AsyncCallback
    $null # user-defined Object
)

[System.Threading.ManualResetEvent]$AsyncWaitHandle = $null
$AsyncWaitHandle = $IAsyncResult.AsyncWaitHandle

[System.Boolean]$Wait = $AsyncWaitHandle.WaitOne( 5000 ) # 5s timeout

if ( $Wait ) {
    # object was signalled, i.e. connect finished or errored
    try {
        $TcpClient.EndConnect( $IAsyncResult )
        if ( -not $TcpClient.Connected ) {
            Throw "TCP connection not connected!"
        }
    } catch {
        Throw "TCP connection error: $_"
    }
} else {
    # timeout
    $TcpClient.Close() # can't wait for EndConnect, so destroy client
    Throw "TCP connection TIMEOUT"
}

# get SSL stream from TCP connection
[System.Net.Security.SslStream]$SslStream = $null
$SslStream = $TcpClient.GetStream()

# authenticate SSL stream
[System.IAsyncResult]$IAsyncResult = $SslStream.BeginAuthenticateAsClient(
    [String]$HostName,
    $null, # AsyncCallback
    $null # user-defined Object
)

[System.Threading.ManualResetEvent]$AsyncWaitHandle = $null
$AsyncWaitHandle = $IAsyncResult.AsyncWaitHandle

[System.Boolean]$Wait = $AsyncWaitHandle.WaitOne( 5000 ) # 5s timeout

if ( $Wait ) {
    # object was signalled, i.e. authenticate finished or errored
    try {
        $SslStream.EndAuthenticateAsClient( $IAsyncResult )
    } catch {
        Throw "SSL authentication error: $_"
    }
} else {
    # timeout
    $SslStream.Close() # can't wait for authenticate, so destroy stream
    $TcpClient.Close() # close TCP connection
    Throw "SSL authentication TIMEOUT"
}

# get X509 certificate
[System.Security.Cryptography.X509Certificates.X509Certificate]$cert = $null
$cert = $SslStream.RemoteCertificate

# get X509 certificate with extra properties
[System.Security.Cryptography.X509Certificates.X509Certificate2]$cer2 = $null
$cer2 = New-Object "System.Security.Cryptography.X509Certificates.X509Certificate2" -ArgumentList $cert

# output expiry
$cer2.NotAfter

# close stream and connection
$SslStream.Close()
$TcpClient.Close()

Not Requiring Validation of the SSL Certificate

So, you want to check an SSL certificate’s expiry date, and you don’t really care what the name is on the remote server certificate. You will be getting validation errors by now, like the following:

Exception calling "AuthenticateAsClient" with "1" argument(s): "The remote certificate is invalid according to the validation procedure."

You replace the following lines of code:

# get SSL stream from TCP connection
[System.Net.Security.SslStream]$SslStream = $null
$SslStream = $TcpClient.GetStream()

with:

# get SSL stream from TCP connection
[System.Net.Security.SslStream]$SslStream = $null
$SslStream = New-Object System.Net.Security.SslStream(
    $TcpClient.GetStream(),
    $True,
    [System.Net.Security.RemoteCertificateValidationCallback]{ $true }
)

This works fine on the first code example given above without timeouts.

But for the asynchronous code with timeouts this attempt to bypass certificate validation gives the error:

SSL authentication error: Exception calling "EndAuthenticateAsClient" with "1" argument(s): "There is no Runspace available to run scripts in this thread. You can provide one in the DefaultRunspace property of the System.Management.Automation.Runspaces.Runspace type. The script block you attempted to invoke was:  $true "

Okay, things are becoming rather tricky rather fast. The issue has been explained elsewhere as:

Asynchronous callback delegates are not a friend to PowerShell. They are serviced by the .NET threadpool which means that if they point to script blocks, there will be no Runspace available to execute them. Runspaces are thread-local resources in the PowerShell threadpool. The .NET threadpool, operating independently, is not too interested in coordinating callbacks with PowerShell. So what do we do?

We’re basically forced to drop into C#/.NET world whether we like it or not. So we might as well provide our own simple class that creates the appropriate callback function.

Add-Type @'
public class MyNoValidate {
  private static System.Boolean bypassvalidation(
    System.Object sender,
    System.Security.Cryptography.X509Certificates.X509Certificate certificate,
    System.Security.Cryptography.X509Certificates.X509Chain chain,
    System.Net.Security.SslPolicyErrors sslPolicyErrors
  ) {
    return true;
  }

  public static System.Net.Security.RemoteCertificateValidationCallback getcallback() {
    System.Net.Security.RemoteCertificateValidationCallback cb;

    cb = new System.Net.Security.RemoteCertificateValidationCallback(
      bypassvalidation
    );

    return cb;
  }
}
'@

and then:

# get SSL stream from TCP connection
[System.Net.Security.SslStream]$SslStream = $null
[System.Net.Security.RemoteCertificateValidationCallback]$Callback = $null
$Callback = [MyNoValidate]::getcallback()
$SslStream = New-Object System.Net.Security.SslStream(
    $TcpClient.GetStream(),
    $True,
    $Callback
)

Now you can get your SSL certificate without having to know the name on the certificate first – with timeouts, too!

Final Note

When getting the expiry time of an SSL certificate please avoid (don’t use) the System.Security.Cryptography.X509Certificates.X509Certificate2.GetExpirationDateString() method! You cannot be sure what you’re getting: whether the date is in USA format or rest-of-the-world format, or in local or UTC time. It is much, much better to use the System.Security.Cryptography.X509Certificates.X509Certificate2.NotAfter property, which is of type System.DateTime.
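
Since NotAfter is a System.DateTime you can format or compare it unambiguously yourself, for example:

# print the expiry as UTC in ISO 8601 form
$cer2.NotAfter.ToUniversalTime().ToString( "yyyy-MM-dd'T'HH:mm:ss'Z'" )

# or show how many whole days remain until expiry
( $cer2.NotAfter - (Get-Date) ).Days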

Using HTML::Mason With CGI Provider

So you want to use HTML::Mason (version 1) but your web provider gives you cPanel-like access to CGI scripting only?

Download HTML::Mason from CPAN and extract the contents of its lib/ directory into your account, say into a directory called /lib/perl/mason.

Then create a file, /public_html/cgi-bin/mason_handler.cgi, which contains:

#!/usr/bin/perl

use lib $ENV{"DOCUMENT_ROOT"} . "/../lib/perl/mason";
use HTML::Mason::CGIHandler;

my $h = HTML::Mason::CGIHandler->new(
  data_dir => '/tmp/mason_data',
  allow_globals => [qw(%session $u)],
);

$h->handle_request;

Now you want to configure Apache to use this handler for Perl Mason webpages in the /public_html/mason directory (directives for both Apache 2.2 and 2.4 are shown below):

<Directory /public_html/mason>
  <FilesMatch "\.html$">
    Action html-mason /cgi-bin/mason_handler.cgi
    SetHandler html-mason

    # for Apache 2.2
    Order allow,deny
    Allow from all

    # for Apache 2.4 (see https://httpd.apache.org/docs/2.4/upgrading.html)
    #Require all granted
  </FilesMatch>

  <FilesMatch "^(autohandler|dhandler)$">
    Action html-mason /cgi-bin/mason_handler.cgi
    SetHandler html-mason

    # for Apache 2.2
    Order allow,deny
    Allow from all

    # for Apache 2.4 (see https://httpd.apache.org/docs/2.4/upgrading.html)
    #Require all granted
  </FilesMatch>
</Directory>

Some CGI website providers require additional Perl modules for HTML::Mason to work; these can all be downloaded and extracted from CPAN:

  • Exception/Class.pm
  • Devel/StackTrace.pm
  • Class/Container.pm
  • Class/Data/Inheritable.pm
  • Params/Validate.pm *
  • Params/ValidatePP.pm *

(the files marked with a * can be downloaded from CPAN and built with the command perl Makefile.PL --pm to force pure-Perl code generation).
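
To confirm the whole chain is working, a trivial test component can be placed in the Mason directory, e.g. /public_html/mason/index.html (the filename is only an example):

% # lines starting with % are Perl, executed when the component runs
% my $now = scalar localtime;
<html>
  <body>
    The time on the server is now <% $now %>.
  </body>
</html>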

Using PowerShell 2.0 With Selenium to Automate Internet Explorer, Firefox, and Chrome

PowerShell 2.0 on Windows XP/7 uses .NET 3.5, so the first thing to do is download the Selenium WebDriver.dll file from Selenium’s download page and extract the net35/ directory.

Internet Explorer

Next you want to obtain the Internet Explorer driver from this site. I recommend version 2.41 because “as of 15 April 2014, IE 6 is no longer supported”. This must reside in your current PATH so in your script you may want to modify your PATH to ensure the executable (IEDriverServer.exe) can be found there. If you’re wondering whether to get the 32-bit or the 64-bit version, start with the 32-bit even if you’ve got a 64-bit Windows.

At this point you’ll want to quickly instantiate Internet Explorer and navigate somewhere. Great. Let’s do it.

# Load the Selenium .Net library
Add-Type -Path "N:\selenium\WebDriver.dll" # or wherever your WebDriver.dll is

# Set the PATH to ensure IEDriverServer.exe can be found
$env:PATH += ";N:\selenium"

# Instantiate Internet Explorer
$ie_object = New-Object "OpenQA.Selenium.IE.InternetExplorerDriver"

This outputs:

New-Object : Exception calling ".ctor" with "0" argument(s): "Request for the permission of type 'System.Net.SocketPermission, System, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089' failed."
At line:1 char:17
+ $ie = New-Object <<<<  "OpenQA.Selenium.IE.InternetExplorerDriver"
    + CategoryInfo          : InvalidOperation: (:) [New-Object], MethodInvocationException
    + FullyQualifiedErrorId : ConstructorInvokedThrowException,Microsoft.PowerShell.Commands.NewObjectCommand

Wait, what’s this? I don’t know. I just don’t know. It will happen if the DLL is on a network drive and not marked as “trusted” (whatever that means). So copy the DLL onto a local hard drive and try again.

# Load the Selenium .Net library
Add-Type -Path "C:\selenium\WebDriver.dll" # put your DLL on a local hard drive!

# Set the PATH to ensure IEDriverServer.exe can be found
$env:PATH += ";N:\selenium"

# Instantiate Internet Explorer
$ie_object = New-Object "OpenQA.Selenium.IE.InternetExplorerDriver"

Great! Now an Internet Explorer window appears. We can navigate to a new URL:

$ie_object.Navigate().GoToUrl( "http://www.bbc.co.uk/languages" )

This worked! The call won’t return until the page download is complete.

Next, let’s click on a link located by its link text:

$link = $ie_object.FindElementByLinkText( "Spanish" )
$link.Click()

# display current URL
$ie_object.Url
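
Other element look-ups follow the same pattern. For example, typing into a text box and submitting its form (the element name "q" below is just a hypothetical form field name):

# find a text box by its name attribute, type into it and submit its form
$textbox = $ie_object.FindElementByName( "q" ) # "q" is a hypothetical field name
$textbox.SendKeys( "hello world" )
$textbox.Submit()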

Firefox

Let’s try it with Firefox now. We require the GeckoDriver from the Selenium downloads page. Note that there is no GeckoDriver support for Windows XP at all.

# Set the PATH to ensure geckodriver.exe can be found
$env:PATH += ";N:\selenium"

$ff_object = New-Object "OpenQA.Selenium.Firefox.FirefoxDriver"

Chrome

Finally let’s try with Google Chrome. We require the ChromeDriver from the Selenium downloads page.

# Set the PATH to ensure chromedriver.exe can be found
$env:PATH += ";N:\selenium"

$chrome_object = New-Object "OpenQA.Selenium.Chrome.ChromeDriver"
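
The driver object exposes the same API regardless of browser, so navigation and shutdown look identical, e.g.:

# navigate, then close the browser and the chromedriver process
$chrome_object.Navigate().GoToUrl( "http://www.bbc.co.uk/languages" )
$chrome_object.Quit()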

Using wget to Automate Logging Into Websites

The open-source wget tool is useful for automating website access and scraping, in particular because it can store and retrieve cookies in a file.

# create a name for the cookie jar/file
COOKIE_JAR=/tmp/cookies.$$.txt

# save cookies from homepage access
wget --spider --save-cookies $COOKIE_JAR --keep-session-cookies http://www.smrt.com.sg/

# now submit request using saved cookies
wget -O - \
  --load-cookies $COOKIE_JAR \
  --save-cookies $COOKIE_JAR \
  --keep-session-cookies \
  --header "Referer: http://journey.smrt.com.sg/" \
  --post-data='startlat=1.357348601&startlng=103.9884093&endlat=1.276243657&endlng=103.8545958&routeopt=fastest&start_type=mrt&end_type=mrt&mode=TRANSIT&use_lrt=yes' \
  https://connect.smrt.wwprojects.com/smrt/api/journey/

Note that --spider performs a HEAD request and does not download the response. Options useful for debugging and seeing what is sent/received are -d and -S. For cookies the --keep-session-cookies option is essential to save session cookies (with no expiry time set) to the cookie file.
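
For example, to watch exactly which headers and cookies are being exchanged, the first request can be re-run with the debugging options mentioned above:

# -d prints wget debug output (including cookie handling), -S prints server response headers
wget -d -S --spider --save-cookies $COOKIE_JAR --keep-session-cookies http://www.smrt.com.sg/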