Cacomania: Quickies


MongoDB with PHP "… BadValue $in needs an array"

Guido Krömer - 1096 days ago - Tags: ,

Recently I had a problem when using MongoDB with PHP and the $in operator, which ended up with the following error: "Can't canonicalize query: BadValue $in needs an array".

It was a simple query and I had used the $in operator a thousand times before, so what went wrong? Let's take a look at the simplified code below: an indexed array of tags is given to filter blog posts in a collection, but before passing the tags to the query I removed invalid tags from the array using unset().

foreach ($tags as $key => $tag) {
    if (!valid($tag)) {
        unset($tags[$key]);
    }
}

$posts = $blogPostCollection->find(['Tag' => ['$in' => $tags]]);

Using unset() was the problem, but why? When iterating over the array after unsetting a value, everything seems to be fine:

php > $ids = ['a', 'b', 'c', 'd'];
php > unset($ids[1]);
php > foreach ($ids as $id) { echo "$id "; }
a c d 

The problem is that unset() leaves holes in an indexed array; 'c' still has the index 2:

php > $ids = ['a', 'b', 'c', 'd'];
php > var_export($ids);
array (
  0 => 'a',
  1 => 'b',
  2 => 'c',
  3 => 'd',
)
php > unset($ids[1]);
php > var_export($ids);
array (
  0 => 'a',
  2 => 'c',
  3 => 'd',
)

One solution is using array_splice() instead of unset(), which prevents holes in an indexed array:

php > $ids = ['a', 'b', 'c', 'd'];
php > array_splice($ids, 1, 1);
php > var_export($ids);
array (
  0 => 'a',
  1 => 'c',
  2 => 'd',
)

Or you could use array_values() on the array after an unset operation, which creates a new, reindexed array containing all values from the original array.

php > $ids = ['a', 'b', 'c', 'd'];
php > unset($ids[1]);
php > var_export(array_values($ids));
array (
  0 => 'a',
  1 => 'c',
  2 => 'd',
)
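Applied to the original code, the fix is a single line before the query. The snippet below is just a sketch, with valid() standing in for whatever the real validation does; the PHP driver serializes an array with gaps in its numeric keys as a BSON object rather than an array, which is why $in complains.

foreach ($tags as $key => $tag) {
    if (!valid($tag)) {
        unset($tags[$key]);
    }
}

// Reindex so the driver serializes $tags as a real array again.
$tags = array_values($tags);

$posts = $blogPostCollection->find(['Tag' => ['$in' => $tags]]);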

I hope this little post helped you with this or a similar problem; feel free to leave a comment.


Unix timestamp conversion with date

Guido Krömer - 1296 days ago - Tags: ,

If you are not smart enough to interpret a Unix timestamp by yourself, date can do it for you.

$ date -u -d @1234567890
Fri Feb 13 23:31:30 UTC 2009

The timestamp can be formatted, too.

$ date -u -d @1234567890 +"%Y-%m-%d %T"
2009-02-13 23:31:30

If you need a timestamp, %s lets date print one.

$ date +"%s"
1386708117


Simple AngularJS sum by key view filter

Guido Krömer - 1305 days ago - Tags: ,

This little view filter for AngularJS sums up the values of a specific field from an array of objects.

This is quite useful if you want to show the total of something in a table footer, for example.

angular.module('caco.feed.filter', [])
    .filter('sumByKey', function() {
        return function(data, key) {
            // Without data or a key there is nothing to sum up.
            if (typeof data === 'undefined' || typeof key === 'undefined') {
                return 0;
            }

            var sum = 0;
            for (var i = data.length - 1; i >= 0; i--) {
                sum += parseInt(data[i][key], 10);
            }

            return sum;
        };
    });

Here is a small fiddle which shows how the filter works:
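In case the embedded fiddle does not show up here, a minimal usage sketch could look like this (AngularJS and the module above are assumed to be loaded; the entries array and its price field are made up for the example):

<div ng-app="caco.feed.filter" ng-init="entries = [{price: 10}, {price: 20}, {price: 12}]">
  Total: {{ entries | sumByKey:'price' }} <!-- prints 42 -->
</div>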

Let me know if you liked or disliked my little posting.


Remove duplicate values from a MySQL table

Guido Krömer - 1342 days ago - Tags:

I needed a way to clean up a MySQL table containing many duplicate rows because of a missing UNIQUE KEY.

One solution would be a script which searches for and removes those dups, but depending on the number of rows this would be a heavy-lifting job. A better solution is performing the task completely in the database, without moving the data into an application and the sanitized data back to the database.

You do not need any PL/SQL…, five simple SQL statements can handle the task. A new table with the structure of the table containing the dups has to be created, with a unique index preventing the dups. The rows from the old table then have to be inserted into the new one; thanks to the IGNORE keyword the query will not stop when reaching a duplicate entry, those dups simply get ignored. The last step is replacing the old table with the new one.

CREATE TABLE `table_without_dups` LIKE `my_table`;

ALTER TABLE `table_without_dups`
ADD UNIQUE `my_unique_key` (`col_1`, `col_n`);

INSERT IGNORE INTO `table_without_dups`
SELECT * FROM `my_table`;

DROP TABLE `my_table`;
RENAME TABLE `table_without_dups` TO `my_table`;

I hope my small posting helped you; feel free to leave a comment.


Search all config files for a specific term

Guido Krömer - 1527 days ago - Tags: , ,

A common problem: which of all the ini files contains the part loading module XYZ?

The find command below searches all ini files in the /etc folder for the term "mysqli".

$ find /etc -name "*.ini" -printf "echo \"FILE: %f\"\ngrep \"mysqli\" %h/%f\n" | bash


Monitor file copy progress with find and watch

Guido Krömer - 1570 days ago - Tags: ,

Currently copying a large number of files and want to see if the job is still running? watch can help.

find lists all files in the folder and its sub-folders, wc -l counts the lines from stdout, and watch executes this every two seconds.

$ watch "find ./ -type f | wc -l"


MySQL rename a database with bash

Guido Krömer - 1605 days ago - Tags: ,

Renaming a MySQL table is straightforward with RENAME TABLE, but renaming a whole database is not possible anymore, since RENAME DATABASE was removed in MySQL 5.1.23 because it could be dangerous.

Since I needed this functionality, a few lines of Bash do nearly the same by creating a new database and renaming each table like this: RENAME TABLE old_database.my_table TO new_database.my_table.

#!/bin/bash
execute_sql () {
  # -N suppresses the column header line, otherwise "Tables_in_..." would be
  # treated as a table name in the loop below.
  echo "$1" | mysql -N
}

rename_database () {
  FROM=$1
  TO=$2

  execute_sql "CREATE DATABASE IF NOT EXISTS $TO"

  TABLES=`execute_sql "SHOW TABLES FROM $FROM"`

  for TABLE in $TABLES; do
    execute_sql "RENAME TABLE $FROM.$TABLE TO $TO.$TABLE"
  done

  echo "Done, drop the old database ($FROM) if needed."
}

rename_database $1 $2
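Saved as rename_database.sh (the file name is only an example) and with the mysql client picking up its credentials from its configuration (e.g. ~/.my.cnf), the script is called with the old and the new database name:

$ ./rename_database.sh old_database new_database
Done, drop the old database (old_database) if needed.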


Merging files together with the PowerShell

Guido Krömer - 1618 days ago - Tags: , ,

After playing around with TypeScript, I needed a way to merge all those single files together. Performing this task with Bash is quite simple. Under Windows there is no Bash, but a thing called PowerShell. PowerShell has a large number of Bash-like aliases, which makes the move from Bash to PowerShell really simple, as you can see in the script below.

$TARGET_FILE = "Full.ts"

rm $TARGET_FILE 

$FILES = ls -Recurse -Filter "*.ts"

foreach ($FILE in $FILES) {
  cat $FILE.FullName >> $TARGET_FILE
}

tsc $TARGET_FILE


Embedding a single file from Gist

Guido Krömer - 1633 days ago - Tags: ,

It seems GitHub removed the embed-single-file "button" from Gist. Maybe they just moved it to somewhere I am unable to find it :D. This is really annoying, since I had to inspect the sources of old blog posts to figure out what the query for embedding a single file has to look like.

So it's just a personal/public note for me.

<script src="https://gist.github.com/%NUMBER_OF_GIST%.js?file=%FILE_NAME%"/>


Splitting an Image in PHP

Guido Krömer - 1652 days ago - Tags:

Some simple tasks end up as a simple script. After trying to create a comparison image, which consists of several screenshots taken at the same resolution with different application settings (detail levels, for example), I finally got mad creating this composed screenshot with GIMP.

Vertically split view of Safari, Opera and Firefox.
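The actual script is behind the read-more link; a minimal sketch of how one vertical slice could be cut out of a screenshot with GD (all file names are made up) might look like this:

// A sketch, not the original script: cut the left third out of a screenshot with GD.
$source = imagecreatefrompng('safari.png');
$width  = imagesx($source);
$height = imagesy($source);
$sliceWidth = (int) ($width / 3);

$slice = imagecreatetruecolor($sliceWidth, $height);
imagecopy($slice, $source, 0, 0, 0, 0, $sliceWidth, $height);
imagepng($slice, 'slice_safari.png');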

read more


Some bash special variables you should never forget

Guido Krömer - 1666 days ago - Tags: ,

Since the main topic here lately has been JavaScript, here is a small foray into the world of Bash special variables. Although there are a lot more special variables in Bash, these might be the most common ones in my opinion.

Getting all params

There are two variables containing all params: the first one is $*, which represents all params as a single quoted string, and the second is $@, where each param is quoted separately. If it's not clear what is going on there, let's have a look at the example below.

#!/bin/bash
ALL_ARGS_STAR=("$*")
echo STAR: ${ALL_ARGS_STAR[0]}
echo STAR: ${ALL_ARGS_STAR[1]}
ALL_ARGS_AT=("$@")
echo AT: ${ALL_ARGS_AT[0]}
echo AT: ${ALL_ARGS_AT[1]}

Accessing the variables ALL_ARGS_STAR and ALL_ARGS_AT using the array indexer should make the difference between $* and $@ clearer.

[caco@MacBook-Air ~]# ./params-list.sh Foo Bar 1337
STAR: Foo Bar 1337
STAR:
AT: Foo
AT: Bar

The other useful ones

  • $0 holds the name of the called script, including the path.
  • $# represents the number of arguments.
  • $1 .. $n access the params directly.
  • $$ gets the PID of the running script.

If you got confused by $0, $#, $1337 or $$, have a look at the self-explanatory script below:

#!/bin/bash
SCRIPT_NAME=`basename $0`
NUM_ARGS=$#
PID=$$
ARG_ONE=$1
ARG_TWO=$2
ALL_ARGS=$@

echo "Script name: $SCRIPT_NAME"
echo "Number of arguments: $NUM_ARGS"
echo "PID: $PID"
echo "Arg one: $ARG_ONE"
echo "Arg two: $ARG_TWO"

i=1
echo -e "Arg:\tVal:"
for ARG in $ALL_ARGS; do
  echo -e "$i\t$ARG"
  i=$(( $i + 1 ))
done

Here is the generated output:

[caco@MacBook-Air ~]# ./bash_params.sh Foo Bar 123
Script name: bash_params.sh
Number of arguments: 3
PID: 25052
Arg one: Foo
Arg two: Bar
Arg:    Val:
1   Foo
2   Bar
3   123


sprintf in JavaScript

Guido Krömer - 1681 days ago - Tags:

This is my implementation of a sprintf-like function in JavaScript. It does not support any fancy number formatting, and the replacement character (%) could be replaced, too. But sometimes it is just good enough.

String.prototype.sprintf = function () {
    var string = this;
    for (var i = 0; i < arguments.length; i++) {
        string = string.replace('%', arguments[i]);
    }
    return string;
}

My sprintf is a method of the string "class", so the call would be myString.sprintf(1, 2, 3) and not, like in C, sprintf(myVar, 1, 2, 3):

console.log('a: % b: % c: % d: %'.sprintf('A'));
console.log('a: % b: % c: % d: %'.sprintf('A', 'B'));
console.log('a: % b: % c: % d: %'.sprintf('A', 'B', 'C'));
console.log('a: % b: % c: % d: %'.sprintf('A', 'B', 'C', 'D'));

Here is the output produced by the code above:

$ node sprintf.js 
a: A b: % c: % d: %
a: A b: B c: % d: %
a: A b: B c: C d: %
a: A b: B c: C d: D

So have fun using and modifying it.


Use a QEMU qcow2 hard disk image with VirtualBox

Guido Krömer - 1716 days ago - Tags: , , ,

Last week I had to use an existing virtual machine on my local computer, which is a Mac. The original host machine was running Linux with KVM for virtualization. The disk image format is qcow2, which is the default format in QEMU and not supported by VirtualBox. But the qemu-img tool can convert a qcow2 image to VirtualBox's vdi format.

# qemu-img convert -O vdi vda.img vda.vdi

After converting the format, the new disk image, named “vda.vdi”, can be used with VirtualBox.


Creating an ISO image with dd and Mac OS X

Guido Krömer - 1728 days ago - Tags: ,

Thank god, dd works on Mac OS X, too. Just log in as root, unmount the possibly auto-mounted disc and use dd as usual.

[caco@MacBook-Air ~]# sudo -s
Password:
[root@MacBook-Air ~]# mount
/dev/disk1 on / (hfs, local, journaled)
devfs on /dev (devfs, local, nobrowse)
map -hosts on /net (autofs, nosuid, automounted, nobrowse)
map auto_home on /home (autofs, automounted, nobrowse)
/dev/disk2s0 on /Volumes/XP_PRO_SP3 (cd9660, local, nodev, nosuid, read-only, noowners)
[root@MacBook-Air ~]# umount /dev/disk2s0
[root@MacBook-Air ~]# dd if=/dev/disk2s0 of=Documents/MY_IMAGE.iso


Remove the passphrase from an SSH keyfile

Guido Krömer - 1730 days ago - Tags: ,

Just a command you may need sometimes, when an SSH keyfile with a passphrase becomes annoying. You will still need the actual passphrase (123456 in the example below) to remove it.

# ssh-keygen -p -P 123456 -N "" -f /home/user/.ssh/id_rsa


Prevent impatient users from clicking a link several times

Guido Krömer - 1745 days ago - Tags: ,

  • Fact 1: Users are very impatient.
  • Fact 2: Some pages have a really long request time.
  • Fact 3: Users try to improve the request time by clicking the same link several times.

This can be bypassed by disabling the links on a page when they got clicked.

$('.one_click_link').click(function() {
    $(this).click(function() {
        return false;
    });
});

This solution has one disadvantage: if the user decides to click another link and then reverts his decision, the already clicked links are still disabled. This is the reason I used the class "one_click_link", which gets assigned only to "problematic" links, instead of giving all links on a page this behavior using $('a').
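One possible way around this (not part of the original solution) would be to block repeated clicks only for a limited time instead of permanently:

$('.one_click_link').click(function () {
    var $link = $(this);

    if ($link.data('busy')) {
        return false; // swallow repeated clicks while the first request is pending
    }

    $link.data('busy', true);
    setTimeout(function () {
        $link.removeData('busy'); // allow clicking again after ten seconds
    }, 10000);
});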


MongoDB: manipulating values if an "update" is not enough

Guido Krömer - 1750 days ago - Tags: , ,

Sometimes the MongoDB update function does not fit the requirements. Since JavaScript can be executed directly in the database (if it is not disabled), it can be used for more complex updates, for example. This happens without moving the data to the application and back to the database.

This small example does the same as: myColl.update({myValue: 1}, {$inc: {myValue: 1}})

db.myCollection.find({myValue: 1}).forEach(function(doc){
    var newValue = doc.myValue + 1;
    db.myCollection.update({_id: doc._id}, {$set: {myValue: newValue}})
})

This is a concrete example where tags have been saved as a comma-separated string. Creating an index on this field, for fast searching in the tags, would not make sense. Converting the string field into an array of tags solves the problem. The piece of code below does this job for each site that has been tagged; to speed things up, only the tags field and MongoDB's _id field get queried from those documents.

db.sites.find({tags: { $exists: true }}, {tags: 1}).forEach(function(doc){
   db.sites.update({_id: doc._id}, {$set: {tags: doc.tags.split(',')}})
})


Merging files together

Guido Krömer - 1757 days ago - Tags: ,

Having a website's CSS or JS separated into different files and sorted into sub-folders is great for development, but it ends up in many requests during live operation. Instead of using a "complex" script which merges those files together, this one-liner can do the job, too.

# find /path/to/css/* -type f -name "*.css" | xargs cat > /path/to/css/screen.css


Auto cleaning up your download folder with find

Guido Krömer - 1778 days ago - Tags: ,

Is your download folder messy, too? Why not let find tidy it up periodically?

# find ~/Downloads/* -ctime +7 -exec rm -rf {} \;

The command above lets find search for files and folders which are older than 7 days and deletes them. But be careful: files inside a folder will be deleted even if they are newer than 7 days, as long as the folder itself is older than 7 days!

0 * * * * find ~/Downloads/* -ctime +7 -exec rm -rf {} \; > ~/Downloads/deleted.files

Running this as a cron job will do the work once an hour, or as often as you want.


FTP upload via cURL

Guido Krömer - 1781 days ago - Tags: , , ,

Uploading a file to an FTP server with cURL is really easy:

# curl -u user:pass -T my_file ftp://example.com/target/path/

Or with a little portion of Bash for uploading multiple files:

# for file in ./*; do curl -u user:pass -T "$file" ftp://example.com/target/path/; done;
