Supply chain attacks on open source

Watch out if you are using libraries and code from public repositories. Supply chain attacks have been on the rise.

The latest one is on Rust.

Fair use of web content

This news was buried among many other stories, but I felt it deserved more attention.

It is about “fair use” of publicly available web content: what counts as “fair use”, and when can access to content be restricted?

The original article is here.

A small company called hiQ is locked in a high-stakes battle over Web scraping with LinkedIn. It’s a fight that could determine whether an anti-hacking law can be used to curtail the use of scraping tools across the Web.

HiQ scrapes data about thousands of employees from public LinkedIn profiles, then packages the data for sale to employers worried about their employees quitting. LinkedIn, which was acquired by Microsoft last year, sent hiQ a cease-and-desist letter warning that this scraping violated the Computer Fraud and Abuse Act, the controversial 1986 law that makes computer hacking a crime. HiQ sued, asking courts to rule that its activities did not, in fact, violate the CFAA.

James Grimmelmann, a professor at Cornell Law School, told Ars that the stakes here go well beyond the fate of one little-known company.

I will leave it up to you to read it and form your own opinion.

Using LDAP and Kerberos with ajaxplorer

12/18/12 Update: not all is peachy keen. Login and account auto-creation work, but logout can be an issue: I need to clear the session cookie when someone logs out, and I have not gotten around to coding that yet.

After a bit of fiddling around, I finally got ajaxplorer working with LDAP/Kerberos 5 as the backend for authentication and access.

We are using LDAP as the user directory and Kerberos 5 for passwords. It’s a little different from what I am used to.

Anyway, I needed to get ajaxplorer working on a large filer so users could access it locally and remotely — essentially our private ‘dropbox’. But getting ajaxplorer working with Kerberos was a bitch! At first I tried using LDAP and got that working… except LDAP does not have our passwords; that’s where Kerberos comes in. I thought about writing my own plugin, but damn it, I don’t have time for that.

After lots of googling and experimenting, I found mod_auth_pam, which uses PAM for HTTP Basic auth. And since we are already using pam_krb5 for logins on our boxes, it’s a perfect fit.
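For reference, the PAM side is just the usual pam_krb5 stack. Here is a sketch of what the PAM service file might look like — the service name mod_auth_pam uses (commonly "httpd") and the module stack vary by distro, so treat this as an assumption, not our exact file:

```
# /etc/pam.d/httpd — hypothetical example; mod_auth_pam typically
# authenticates against the "httpd" PAM service. Assumes pam_krb5
# is installed and the host has a valid krb5.conf.
auth     required   pam_krb5.so
account  required   pam_krb5.so
```

Check your distro’s mod_auth_pam documentation for the actual service name before copying this.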

Here is the section in my bootstrap_plugins.php:

$PLUGINS = array(
        "CONF_DRIVER" => array(
                "NAME"          => "serial",
                "OPTIONS"       => array(
                        "REPOSITORIES_FILEPATH" => "AJXP_DATA_PATH/plugins/conf.serial/repo.ser",
                        "ROLES_FILEPATH"        => "AJXP_DATA_PATH/plugins/auth.serial/roles.ser",
                        "USERS_DIRPATH"         => "AJXP_DATA_PATH/plugins/auth.serial",
                        "FAST_CHECKS"           => false,
                        "CUSTOM_DATA"           => array(
                                "email"   => "Email",
                                "country" => "Country"
                        )
                )
        ),
        "AUTH_DRIVER" => array(
                "NAME"          => "basic_http",
                "OPTIONS"       => array(
                        "USERS_FILEPATH"        => "AJXP_DATA_PATH/plugins/auth.pam/users.ser",
                        "AUTOCREATE_AJXPUSER"   => true,
                        "TRANSMIT_CLEAR_PASS"   => false
                )
        ),
        /*
        // The old "serial" auth driver this setup replaces, kept for reference:
        "AUTH_DRIVER" => array(
                "NAME"          => "serial",
                "OPTIONS"       => array(
                        "LOGIN_REDIRECT"        => false,
                        "USERS_FILEPATH"        => "AJXP_DATA_PATH/plugins/auth.serial/users.ser",
                        "AUTOCREATE_AJXPUSER"   => false,
                        "FAST_CHECKS"           => false,
                        "TRANSMIT_CLEAR_PASS"   => false
                )
        ),
        */
        "LOG_DRIVER" => array(
                "NAME" => "text",
                "OPTIONS" => array(
                        "LOG_PATH" => (defined("AJXP_FORCE_LOGPATH") ? AJXP_FORCE_LOGPATH : "AJXP_INSTALL_PATH/data/logs/"),
                        "LOG_FILE_NAME" => 'log_' . date('m-d-y') . '.txt',
                        "LOG_CHMOD" => 0770
                )
        )
);

And the section in my /etc/httpd/conf.d/ajaxplorer.conf file:

   <Directory "/usr/share/ajaxplorer">
        Options FollowSymLinks
        AllowOverride Limit FileInfo
        Order allow,deny
        Allow from all
        AuthName "Ajaxplorer Access"
        AuthType Basic
        AuthPAM_Enabled on
        Require valid-user
        php_value error_reporting 2
   </Directory>

The trick is these two lines for the “basic_http” auth driver:

"USERS_FILEPATH" => "AJXP_DATA_PATH/plugins/auth.pam/users.ser",
"AUTOCREATE_AJXPUSER" => true,

That allows my users to log in: on first login they authenticate via mod_auth_pam, and ajaxplorer creates their account in “AJXP_DATA_PATH/plugins/auth.pam/users.ser”.

NOTE: I had to manually create the plugins/auth.pam directory and an empty users.ser file inside it.
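The manual step above boils down to something like this — a sketch only, since your AJXP_DATA_PATH, web server user, and permissions will differ (the `./data` default and the `apache` user below are placeholders):

```shell
# Pre-create the auth.pam user store that ajaxplorer expects.
# Point AJXP_DATA at your real AJXP_DATA_PATH
# (e.g. /usr/share/ajaxplorer/data); "./data" is only a placeholder.
AJXP_DATA="${AJXP_DATA:-./data}"
mkdir -p "$AJXP_DATA/plugins/auth.pam"
: > "$AJXP_DATA/plugins/auth.pam/users.ser"   # create empty users.ser
# Make it writable by the web server user (user name varies by distro).
chown apache:apache "$AJXP_DATA/plugins/auth.pam/users.ser" 2>/dev/null || true
chmod 0660 "$AJXP_DATA/plugins/auth.pam/users.ser"
```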

But after that, everything works perfectly.

dynamic robots.txt file in Rails 3.x

We need dynamic handling of the robots.txt file because we have different requirements for production, staging, dev, test, etc.

Google-fu turns up various ways to do this — some for Rails 2.x, some for Rails 3.x. Here is my version.

First, edit config/routes.rb and add this line:

match '/robots.txt' => RobotsGenerator

Then add the following to app_root/lib/classes/robots_generator.rb.

NOTE: We have an old domain that redirects to our current one. We don’t want the old domain to get indexed, so it gets special treatment in production.

class RobotsGenerator
  # Use the config/robots.txt in production.
  # Disallow everything for all other environments.
  def self.call(env)
    request = Rack::Request.new(env)
    headers = {}
    body = if Rails.env.production?
      if request.host =~ /$/   # old-domain pattern; the domain itself is elided here
        headers = { 'X-Robots-Tag' => 'noindex,nofollow' }
        "User-agent: *\nDisallow: /"
      else
        File.read Rails.root.join('config', 'robots.txt')
      end
    else
      "User-agent: *\nDisallow: /"
    end
    [200, headers, [body]]
  rescue Errno::ENOENT
    [404, {}, ["User-agent: *\nDisallow: /"]]
  end
end

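This works because a Rails 3 route can point at any Rack endpoint: anything that responds to call(env) and returns a [status, headers, body] triple. A minimal standalone sketch of that contract — no Rails required, and StaticRobots is a made-up name for illustration:

```ruby
# Minimal Rack-endpoint sketch (hypothetical StaticRobots class).
# Anything with call(env) -> [status, headers, enumerable-body]
# can be mounted in routes.rb the same way as RobotsGenerator.
class StaticRobots
  DISALLOW_ALL = "User-agent: *\nDisallow: /".freeze

  def self.call(env)
    [200, { 'Content-Type' => 'text/plain' }, [DISALLOW_ALL]]
  end
end

status, headers, body = StaticRobots.call({})
puts status       # => 200
puts body.first   # => the robots.txt payload
```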
Finally, move public/robots.txt to config/robots.txt; if the static file stays in public/, it will be served before the route is ever hit.

I want to give credit to the people whose posts inspired my version.
