Newsbeuter is an open source RSS feed reader for terminals.
It's great for quickly consuming lots of information from websites. One nice feature is that it allows you to 'bookmark' specific links, and that's where LinkiePie comes in.
LinkiePie is a personal project of mine for storing and organising links, somewhat like Pocket or Instapaper, and I would like to send my links directly from newsbeuter to it.
It has a basic API for external access to some functionality, and each account is automatically assigned an API key. The following is a quick guide to sending your bookmarked newsbeuter links straight to your LinkiePie account.

The first step, of course, is to get a LinkiePie account. When you have registered, check out https://linkiepie.com/settings/ to get your API key. This is all you need to programmatically add and retrieve your links with LinkiePie.

Next we will set up a simple script that will interface with newsbeuter. Save the following code in a convenient directory or download it from here.
#!/bin/bash
# newsbeuter passes the bookmarked URL as the first argument
URL="$1"
USERNAME=YOUR_USERNAME
API_KEY=YOUR_KEY
# POST the URL as JSON to the LinkiePie links endpoint
curl -k -H "Content-Type: application/json" -X POST --data "{\"url\": \"${URL}\"}" "https://linkiepie.com/api/v1/links/?username=${USERNAME}&api_key=${API_KEY}"

Make sure to change YOUR_USERNAME and YOUR_KEY to your own details.
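The same request can also be made from Python if you'd rather not shell out to curl. This is only a sketch: the endpoint and parameter names are copied from the script above, and `build_request` is an illustrative helper, not part of the LinkiePie API.

```python
import json

# Endpoint and parameter names are taken from the curl command above.
API_ROOT = 'https://linkiepie.com/api/v1/links/'

def build_request(url, username, api_key):
    """Build the endpoint URL and JSON body for one bookmarked link."""
    endpoint = '{0}?username={1}&api_key={2}'.format(API_ROOT, username, api_key)
    body = json.dumps({'url': url})
    return endpoint, body

# The request itself could then be sent with urllib2 (Python 2), e.g.:
#   req = urllib2.Request(endpoint, body,
#                         {'Content-Type': 'application/json'})
#   urllib2.urlopen(req)
```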
Assuming you have the script in a file called linkiepie-bookmark.sh and have placed it in your home directory, the next step is to update your newsbeuter config file (.newsbeuter/config) with the following:
bookmark-cmd "~/linkiepie-bookmark.sh"
bookmark-autopilot yes

The final step is to set the executable bit on the script so newsbeuter can execute it.
chmod +x linkiepie-bookmark.sh

Restart newsbeuter and try it out by pressing Ctrl+B with an article highlighted or selected. You should then see that article in your LinkiePie account along with an archived version of the important text.

You can retrieve your data from LinkiePie via the API as well. The following request will return the url, title and extracted text of the links you've added.
https://linkiepie.com/api/v1/links/?username=YOUR_USERNAME&api_key=YOUR_KEY&format=json&limit=10&offset=0

The limit parameter controls the number of results returned per request, and the offset parameter specifies where in the list of results to start, so adjusting it lets you page through the full set.
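To make the paging concrete, here is a small sketch of the limit/offset arithmetic (`page_url` is an illustrative helper; the parameter names come from the request above):

```python
API_ROOT = 'https://linkiepie.com/api/v1/links/'

def page_url(username, api_key, limit=10, page=0):
    """URL for one page of results: page 0 starts at offset 0,
    page 1 at offset `limit`, and so on."""
    offset = page * limit
    return ('{0}?username={1}&api_key={2}&format=json'
            '&limit={3}&offset={4}').format(API_ROOT, username,
                                            api_key, limit, offset)

# Fetch each page with urllib2.urlopen(page_url(...)), stopping once a
# page comes back with fewer than `limit` links.
```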
This was posted on Tue 16 Dec 2014 (4 years, 2 months ago) by Ryan McConville
I was wondering how the Bing Desktop application manages to retrieve the Bing image each day, so I used Wireshark to monitor the application as it ran. I noticed that it retrieves an XML document from http://www.bing.com/hpimagearchive.aspx?format=json&idx=0&n=1&mbl=1&mkt=en-ww, which provides the URL for the image. Using this I updated the script I originally posted here. It also looks like we can retrieve the higher resolution 1920x1200 image from this URL outside the USA too.

#!/usr/bin/python
import urllib2
import os
import sys
import shutil
import argparse

from xml.dom import minidom


def main(save_location):
    bing_url = 'http://www.bing.com'
    xml_url = 'http://www.bing.com/hpimagearchive.aspx?format=json&idx=0&n=1&mbl=1&mkt=en-ww'

    xml = urllib2.urlopen(xml_url)
    xml_doc = minidom.parse(xml)
    url_base = xml_doc.getElementsByTagName('urlBase')[0].firstChild.nodeValue
    extension = xml_doc.getElementsByTagName('url')[0].firstChild.nodeValue.split('.')[-1]
    image_url = bing_url + url_base + '_1920x1200.' + extension

    try:
        img = urllib2.urlopen(image_url)
    except urllib2.HTTPError as e:
        print(e)
        sys.exit(-1)

    image_name = image_url.split('/')[-1]

    if not os.path.exists(save_location):
        os.makedirs(save_location)

    if save_location[-1] != '/' and save_location[-1] != '\\':
        save_location += '/'

    with open(save_location + image_name, 'wb') as f:
        shutil.copyfileobj(img, f)


if __name__ == "__main__":
    script_dir = os.path.dirname(os.path.realpath(__file__))
    parser = argparse.ArgumentParser(description="Download today's Bing image.")
    parser.add_argument('-d', '--dir', dest='save_location', action='store',
                        default=script_dir,
                        help='Directory to store the downloaded image (default: directory of this program)')

    results = parser.parse_args()

    main(results.save_location)


You can also find the repository at https://bitbucket.org/ryanmcconville/bing-daily-image-retriever
This was posted on Tue 24 Jun 2014 (4 years, 8 months ago) by Ryan McConville
Tags: code
Continuing my effort to turn old bits of code lying around on my computer and internal git server into blog posts, I decided to post this small C++ project. It logs the percentage of battery left at specific intervals into a SQLite3 database. Again, this is one of my first C++ projects so I'm not exactly confident it's done to a high standard. :)

Nonetheless, the code is included below and the git repository on Bitbucket can be found here: https://bitbucket.org/ryanmcconville/windows-battery-stats/

#include "stdafx.h"
#include <iostream>
#include <sstream>
#include <ctime>
#include <cstdio>
#include <cstdlib>
#include <cstring>
#include <vector>
#include <fstream>
#include <Windows.h>
#include <Shlobj.h>
#include <WbemCli.h>
#include "sqlite3.h"

using namespace std;

//get the user's home directory
wstring GetHomeDirectory(){
    PWSTR path;
    SHGetKnownFolderPath(FOLDERID_Profile, 0, NULL, &path);
    wstring home(path);
    CoTaskMemFree(path); //SHGetKnownFolderPath allocates the path; free it
    return home;
}

void WriteDefaultConfiguration(char *defaulttime){
    ofstream config_file;
    config_file.open("config.ini", ios::out);
    config_file.write(defaulttime, strlen(defaulttime));
    config_file.close();
}

int ConvertToMilliseconds(int interval){
    return ((interval * 60) * 1000);
}

int ReadConfigurationFile(){
    char defaulttime[7] = "10";
    char line[10];
    ifstream config_file;
    config_file.open("config.ini", ios::in);
    if(!config_file.is_open()){
        WriteDefaultConfiguration(defaulttime);
        config_file.open("config.ini", ios::in);
        if(!config_file.is_open()){
            return ConvertToMilliseconds(atoi(defaulttime));
        }
    }
    config_file.get(line, sizeof(line));
    int interval = atoi(line);
    return ConvertToMilliseconds(interval);
}

sqlite3* CreateDatabase(){
    sqlite3 *db;

    int rc = sqlite3_open("battery_stats.sqlite", &db);

    if(rc){
        fprintf(stderr, "can't open database %s\n", sqlite3_errmsg(db));
    }

    const char *sql = "CREATE TABLE IF NOT EXISTS stat("
        "date DATETIME NOT NULL,"
        "battery_percentage INT NOT NULL,"
        "ac_connected INT NOT NULL);";

    sqlite3_stmt *statement;
    if(sqlite3_prepare_v2(db, sql, -1, &statement, 0) == SQLITE_OK){
        sqlite3_step(statement);
        sqlite3_finalize(statement);
    }
    //keep the connection open; the caller uses it for inserts
    return db;
}

int _tmain(int argc, _TCHAR* argv[]){

    //hide console window
    HWND hWnd = GetConsoleWindow();
    ShowWindow(hWnd, SW_HIDE);
    int interval_time = ReadConfigurationFile();
    sqlite3 *db = CreateDatabase();

    while(true){
        SYSTEM_POWER_STATUS status;
        ::GetSystemPowerStatus(&status); //get battery life
        int per = status.BatteryLifePercent;
        //get time
        time_t rawtime;
        struct tm* printtime;
        time(&rawtime);
        printtime = localtime(&rawtime);
        string timenow = asctime(printtime);
        //create buffer to hold percentage as a string
        char strPer[4];
        //copy int percentage to string
        itoa(per, strPer, 10);
        //open textfile to write to
        ofstream batteryStats;
        batteryStats.open(GetHomeDirectory() + L"/battery_stats.txt", ios::app);
        int AC = (int) status.ACLineStatus;
        string acconnected;
        if(AC){
            acconnected = "AC connected";
        }else{
            acconnected = "AC disconnected";
        }

        if(!batteryStats.is_open()){
            cout << "Error";
        }else{
            batteryStats << "\n" << timenow << strPer << "% " << acconnected;
        }

        sqlite3_stmt *statement;
        const char *sql = "INSERT INTO stat (date, battery_percentage, ac_connected) VALUES(?, ?, ?);";
        if(sqlite3_prepare_v2(db, sql, -1, &statement, 0) == SQLITE_OK){
            sqlite3_bind_text(statement, 1, timenow.c_str(), -1, SQLITE_TRANSIENT);
            sqlite3_bind_int(statement, 2, per);
            sqlite3_bind_int(statement, 3, AC);

            sqlite3_step(statement);
            sqlite3_finalize(statement);
        } else {
            fprintf(stderr, "can't update database %s\n", sqlite3_errmsg(db));
        }

        batteryStats.close();
        //sleep for specified time
        Sleep(interval_time);
    }
}

This was posted on Tue 24 Jun 2014 (4 years, 8 months ago) by Ryan McConville
A few years ago, when learning Python, I wrote a little script that retrieves the current bing.com background image and saves it to disk. The directory where the image is saved is given as the first argument. You can run it via cron if you use a Linux-based OS, or via Windows Task Scheduler if you use, well, Windows.

I did notice that if you access bing.com from a USA IP address then you have the ability to retrieve a higher resolution 1920x1200 image, whereas from the UK (and likely elsewhere) you can only access the 1366x768 image.

import re
import urllib2
import os
import shutil
import sys

bing_url = 'http://www.bing.com'
save_location = sys.argv[1]

p = re.compile('/az/.+?(jpg)')
resp = urllib2.urlopen(bing_url).read()

image_url = p.search(resp).group()
if '_EN-US' in image_url:
    res = image_url.split('_')
    extension = image_url.split('.')[-1]
    image_url = '_'.join(res[:-1]) + '_1920x1200.' + extension

try:
    img = urllib2.urlopen(bing_url + image_url)
except urllib2.HTTPError as e:
    print(e)
    sys.exit(-1)

image_name = image_url.split('/')[-1]

if not os.path.exists(save_location):
    os.makedirs(save_location)
with open(os.path.join(save_location, image_name), 'wb') as f:
    shutil.copyfileobj(img, f)


Edit: I have since updated this script with some nice new changes. You can find that post here: https://ryanmcconville.com/blog/post/bing-image-retrieval-v2/
This was posted on Mon 23 Jun 2014 (4 years, 8 months ago) by Ryan McConville