Jupyter Widgets can have a separate frontend and backend component. Sometimes, you need to send a message from one to the other. This example shows the basics on sending a message from the backend to the frontend.
In your Widget’s frontend (JavaScript) code, listen for the custom event from the backend by subscribing to the model’s msg:custom event.
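The sending half lives in Python. A minimal sketch, assuming an ipywidgets DOMWidget subclass (the class name and payload here are made up for illustration; send() is the ipywidgets call that delivers a custom message to the frontend model):

```python
import ipywidgets as widgets

class ExampleWidget(widgets.DOMWidget):
    """Hypothetical widget illustrating a backend -> frontend message."""

    def fire_event(self):
        # send() delivers a JSON-serializable payload to the frontend,
        # where the model receives it as a 'msg:custom' event.
        self.send({"event": "ping", "value": 42})
```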
Jupyter Widgets can have a separate frontend and backend component. Sometimes, you need to send a message from one to the other. This example shows the basics on sending a message from the frontend to the backend.
In your Widget’s backend (Python) code, listen for the custom event from the frontend by registering a handler with the widget’s on_msg method.
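A minimal sketch of such a handler, assuming an ipywidgets DOMWidget subclass (the class and method names here are made up for illustration; on_msg is the ipywidgets call that registers the callback):

```python
import ipywidgets as widgets

class ExampleWidget(widgets.DOMWidget):
    """Hypothetical widget illustrating a frontend -> backend message."""

    def __init__(self, **kwargs):
        super().__init__(**kwargs)
        # on_msg registers a callback(widget, content, buffers) that fires
        # whenever the frontend calls model.send()
        self.on_msg(self._on_frontend_msg)

    def _on_frontend_msg(self, widget, content, buffers):
        # content is whatever JSON-serializable object the JavaScript side sent
        print("message from frontend:", content)
```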
When writing a custom Jupyter Widget, sometimes you need to know the Cell (CodeCell, usually) that your Widget is running in. You can get a list of all Cells from the Notebook object in JavaScript, but finding your Cell isn’t exactly straightforward.
Fortunately, it isn’t that hard to determine your Cell: find the container element for your Widget, then loop through all Cells in the Notebook to see which one contains it.
A very common file format for transferring data is the Comma (or TAB) Separated Value file. These formats appear simple, but have several complications that make implementing a CSV/TSV parser slightly more challenging than just breaking a row of text by its separator.
It’s probably best to start with some definitions before diving into any details. This file format is a text file where each line of text is like a row in a spreadsheet, and each of those rows is made up of columns.
The generally accepted practice is to call the rows Records, and the columns Fields. The first line is typically included, but not required, and contains the Field names.
basic parsing
Fields are separated by their separator character, which is a comma for a CSV and a TAB for a TSV:
id,animal,sound
1,cat,Meow
2,dog,Bark
3,cow,Moo
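This simple case can be handled by splitting each Record on the separator. A minimal sketch in Python (it falls apart as soon as quoting enters the picture, as the next sections show):

```python
# Naive parser for the simple case above: no quoting, no embedded separators.
def parse_simple(text, sep=","):
    return [line.split(sep) for line in text.strip().splitlines()]

rows = parse_simple("id,animal,sound\n1,cat,Meow\n2,dog,Bark\n3,cow,Moo")
```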
embedded commas
This works perfectly fine for a TSV, since the value in a Field probably wouldn’t contain a TAB character. But what about a CSV? It seems reasonable that the value in a Field could have one or more commas.
There’s no reasonable way to parse the Fields of a Record that contains unquoted commas (note that Field names in the first row, if included, are not required to match the Record Fields, so the header row is no help here). The solution is to require double quotes around a Field that contains a comma.
Now we can parse such a file, since the double quotes indicate the beginning and end of a Field. If we see a starting double quote, we can just keep reading until we see a closing double quote.
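Python’s standard csv module implements exactly this quoting rule. A quick check (the animal data here is made up for illustration):

```python
import csv
import io

# The third Field of the first data Record contains a comma,
# so it must be double-quoted.
data = 'id,animal,sound\n1,cat,"Meow, meow"\n2,dog,Bark\n'
rows = list(csv.reader(io.StringIO(data)))
```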
embedded double quotes
But what about double quotes themselves? It’s certainly common to see double quotes in a Field: the text could be quoted, or a double quote could be used to mean inches. If we used the above logic, where an opening and closing double quote mark where a Field begins and ends, a Field that itself contains a double quote will cause trouble.
The answer is to “escape” the double quotes in some way, and the decision was to use double double quotes. However, this causes its own problem: if we’re using double quotes to determine whether commas are part of a Field or are the separator between Fields, how can we tell? The answer, again, is to require double quotes around a Field that contains a double quote (or, more accurately now, an escaped double quote).
This use of escaping double quotes, and requiring double quotes around a Field that contains double quotes is common to both CSV and TSV.
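Here is how that looks in practice, again using Python’s csv module (the screen data is made up for illustration):

```python
import csv
import io

# A Field containing a double quote must itself be double-quoted, and the
# embedded double quote is doubled; 15"" parses back to 15".
data = 'id,name,size\n1,screen,"15"" diagonal"\n'
rows = list(csv.reader(io.StringIO(data)))
```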
embedded newlines or carriage returns
The final major thing to consider when parsing a CSV or TSV is that a Field could contain a newline or carriage return character:
id,name,description
1,shoe,these are fine shoes
2,round clock,features:
2 hands
round, 8"
maple wood
3,used laptop,technical specs:
2048 Neurobit 1098X3 CPU
12 Parsec pseudo-drive
15" screen
Since a newline (or carriage return) should indicate the end of the current Field and also the completion of the current Record, how do we deal with them as part of a Field? I’m sure you’ve guessed that double quotes are involved, and they are:
id,name,description
1,shoe,these are fine shoes
2,round clock,"features:
2 hands
round, 8""
maple wood"
3,used laptop,"technical specs:
2048 Neurobit 1098X3 CPU
12 Parsec pseudo-drive
15"" screen"
Notice that there are also escaped double quotes and commas in these Fields as well. The opening and closing double quotes allow for those in the Field.
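The clock/laptop example above can be round-tripped through Python’s csv module, which handles the multi-line quoted Fields, the escaped double quotes, and the embedded commas all at once. A sketch:

```python
import csv
import io

# The example data from above, as one Python string; the quoted Fields
# span multiple lines and contain escaped double quotes and commas.
data = (
    "id,name,description\n"
    "1,shoe,these are fine shoes\n"
    '2,round clock,"features:\n2 hands\nround, 8""\nmaple wood"\n'
    '3,used laptop,"technical specs:\n2048 Neurobit 1098X3 CPU\n'
    '12 Parsec pseudo-drive\n15"" screen"\n'
)
records = list(csv.reader(io.StringIO(data)))
```

Note that csv.writer applies these quoting rules automatically when writing, so round-tripping data through it is usually safer than hand-rolling a parser.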
other considerations
In general, this covers the major issues faced when parsing a CSV or TSV file. Note that a TSV can also allow for some special backslash-escaped characters, such as \n for a newline, \t for a TAB, \r for a carriage return, and \\ for a literal backslash.
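Python’s csv module reads TSV by passing delimiter="\t", but it does not undo backslash escapes, so producers that use them need a separate unescaping pass. A sketch (the helper name and data are made up for illustration):

```python
import csv
import io

def unescape_tsv(field):
    # Undo the backslash escapes some TSV producers use:
    # \n newline, \t tab, \r carriage return, \\ backslash.
    mapping = {"n": "\n", "t": "\t", "r": "\r", "\\": "\\"}
    out = []
    i = 0
    while i < len(field):
        if field[i] == "\\" and i + 1 < len(field):
            out.append(mapping.get(field[i + 1], field[i + 1]))
            i += 2
        else:
            out.append(field[i])
            i += 1
    return "".join(out)

data = "id\tname\tnotes\n1\tclock\tline one\\nline two\n"
rows = list(csv.reader(io.StringIO(data), delimiter="\t"))
notes = unescape_tsv(rows[1][2])
```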
Usually, when you get your SSL certificates, they are .crt, .key, and .ca-bundle files. These work fine for Apache’s HTTP server, but Apache’s Tomcat server needs these converted into a .jks (Java Key Store), and the Tomcat configuration set up to use that key store. To simplify the conversion, here is a shell script to perform the steps, under the assumption that the .crt, .key, and .ca-bundle files all have the same prefix.
#!/bin/sh
if [ "$1" = "" ]; then
echo ""
echo " usage: $0 <file-prefix> <password>"
echo ""
echo " This tool requires that all files have the same prefix, and the .crt, .key, and .ca-bundle files exist."
echo ""
echo " For example, if your files are named example.com.crt, example.com.key, example.com.ca-bundle, you would do:"
echo ""
echo " $0 example.com mySekretPasswd"
echo ""
exit 1
fi
echo ""
echo " Generating JKS file for $1..."
echo ""
echo "----------------------------------------------------------"
openssl pkcs12 -export -in "$1.crt" -inkey "$1.key" -name "$1" -out "$1.p12" -passout "pass:$2"
keytool -importkeystore -deststorepass "$2" -destkeystore "$1.jks" -srckeystore "$1.p12" -srcstoretype PKCS12 -srcstorepass "$2"
keytool -import -alias bundle -trustcacerts -file "$1.ca-bundle" -keystore "$1.jks" -storepass "$2"
prefix_alias=$(keytool -list -v -keystore "$1.jks" -storepass "$2" | grep -i alias | grep "$1")
if [ "$prefix_alias" = "" ]; then
echo ""
echo " ** something seems to have gone wrong, $1 not found in aliases"
echo ""
exit 1
fi
echo "----------------------------------------------------------"
echo ""
echo " JKS file created."
echo ""
echo " Copy $1.jks to Tomcat's ssl directory, typically something like /etc/tomcat8/ssl/$1.jks"
echo ""
echo " Add or Update the <Connector> entries in Tomcat's server.xml to be something like:"
echo ""
echo " <Connector port=\"8443\" protocol=\"org.apache.coyote.http11.Http11NioProtocol\" maxThreads=\"150\" SSLEnabled=\"true\" scheme=\"https\" secure=\"true\" clientAuth=\"false\" sslProtocol=\"TLS\" keystoreFile=\"/etc/tomcat8/ssl/$1.jks\" keystoreType=\"JKS\" keystorePass=\"$2\" keyAlias=\"$1\" />"
echo " <Connector port=\"8009\" protocol=\"AJP/1.3\" redirectPort=\"8443\" />"
echo ""
An example of using the tool, assuming the script above is saved as make-jks.sh and your certificate files all start with example.com:
./make-jks.sh example.com mySekretPasswd
Setting up an Ubuntu machine to act as an Apple Time Machine server is surprisingly simple. This example uses a directory on the boot drive (/srv/netatalk/time-machine), but it’s more likely that you want to use a directory on a large disk. Here are the steps…
Install the needed packages:
sudo apt install netatalk avahi-daemon
Edit the netatalk config file:
sudo vi /etc/netatalk/afp.conf
Add a section for your Time Machine:
[Time Machine]
path = /srv/netatalk/time-machine
time machine = yes
Create a directory to act as the Time Machine drive:
sudo mkdir -p /srv/netatalk/time-machine
Now, on your Mac, you should be able to open the Time Machine settings in System Preferences and use Select Disk… to pick your new Time Machine backup drive.
Update for Ubuntu 20.04 and other notes
Since I first wrote this, I have updated to Ubuntu 20.04 and everything still seems to work. However, I never made it clear that you must make sure your backup drive is available and connected on your Mac before you can use it as a Time Machine backup drive.
Once your drive is set up under Ubuntu, go to your Mac and open a Finder window. Under the Network section in the sidebar, you should see your Ubuntu machine listed. Double-click on the machine name, and you should see any shared folders on the machine. You may have to click on the Connect button in the upper right of the window to login before you can use the drive.
Once you are logged in, you should then be able to use the drive with Time Machine.
Setting up Samba under Ubuntu 19.10 is relatively easy. This guide will show how to install Samba itself, then configure both a public drive meant to be shared among multiple users, and a per-user drive.
NOTE: This guide assumes your linux machine is on your local network.
First, install both samba and smbclient:
sudo apt install samba smbclient
Next, create a directory that will be the shared public drive, and set its ownership (nobody:nogroup suits a guest-accessible share; adjust to taste):
sudo mkdir -p /srv/samba/public
sudo chown nobody:nogroup /srv/samba/public
Now it’s time to configure Samba. There are two basic things that need to be configured: setting the user security, and adding the public drive.
To set the user security, set security = user in the [global] section of /etc/samba/smb.conf.
Enable the per-user drive in /etc/samba/smb.conf:
[homes]
comment = Home Directories
browseable = no
read only = no
To add the public drive, add this section to the end of /etc/samba/smb.conf:
[public]
comment = Public Files
path = /srv/samba/public
browsable = yes
guest ok = yes
read only = no
create mask = 0755
Now, restart the Samba services to pick up these configuration changes:
sudo systemctl restart smbd.service nmbd.service
Since Samba doesn’t use the linux login credentials for a user, you must add each user that needs access to a shared drive using the smbpasswd command:
sudo smbpasswd -a <unix username>
Also, if you’re running a firewall on your linux machine, you’ll probably have to allow access for your local network. You can allow specific machines, or a subnet. I use ufw to control my firewall configuration, so for me I simply allowed all access for my internal network:
sudo ufw allow from 192.168.0.0/16
To connect to a drive from Windows, I right-click on the Network item in File Explorer and select Map network drive..., and use \\<hostname>\public or \\<hostname>\<unix username> as the Folder.
To connect to a drive from the Mac, I use Go -> Connect to Server... in the Finder, then use smb://<hostname>/public or smb://<hostname>/<unix username> as the address.