Monday, December 7, 2009

Radar Chart Gadget verified by Google - Yeah!

Google now requires all custom gadgets to be verified before collaborators can view them. Refer to the Gadgets: Verifying Custom Gadget page in the Google Docs help for more information.

I developed a Radar Chart Google Spreadsheet Gadget a while ago, and our department has been using it to visualize our assessment data. So I had better submit it to Google for verification as soon as possible.

It took about 3 business days... and YES! As of noon today, Google has verified my radar chart gadget. I hope it will be listed in their gadget gallery soon.

Internet Explorer Strikes Again

Just found out my Radar Chart Google Spreadsheet Gadget did not work on all popular browsers.  Guess which one?!

The error message was:
window.G_vmlCanvasManager is null or not an Object
I swear I had verified it working on Internet Explorer before.
I swear I had not made any change since.
and I swear...
and I swear...

Apparently, there was a problem loading excanvas.js (a JavaScript library that enables the HTML5 canvas element in Internet Explorer). As a result, window.G_vmlCanvasManager did not exist.

My original code was:
<Content type="html"><![CDATA[
<!--[if IE]><script type="text/javascript" src=""></script><![endif]-->
<script src="" type="text/javascript"></script>
As it turns out, there needs to be something between <![CDATA[ and the conditional comment <!--[if IE]>... even if it is just a <p> or another <script> line.

So I changed the code to:
<Content type="html"><![CDATA[
<script src="" type="text/javascript"></script>
<!--[if IE]><script type="text/javascript" src=""></script><![endif]-->
The gadget is working once again.

Tuesday, May 12, 2009

Finished - Radar Chart Google Spreadsheet Gadget, That Is!

I believe I have finished the work on creating a Radar Chart Google Spreadsheet Gadget. Here are the changes since my previous release:
  • Added the ability to toggle individual records on and off in the chart. This allows better visual comparison between different records and/or the overall average.
  • Added the ability to toggle the competency line on and off. Also, if the user does not input a competency value, the line is not shown at all.
  • Added better error handling for when any of the user inputs or spreadsheet data is not a number.
  • Added a process to determine the real maximum value for the scales in the radar chart.
  • Fixed label alignment problems with various numbers of radar lines (columns).
  • Changed the gadget size to have a default height of 250px so it renders properly in iGoogle.
  • Changed the gadget size to use a dynamic height for large record sets.
  • Added a user-configurable refresh time for the gadget; the default is 5 minutes.
  • If "Calculate average" is chosen, then the gadget will draw the radar chart of the overall average by default. Otherwise, it will draw the radar chart of all records.
  • Added a comment field.
This is what the current release looks like:

You can use this gadget with your own spreadsheet data. Here is how:
You can give me feedback via the comments section.

If it looks good, I will submit it to the Google Gadget Gallery in a couple of weeks.

Tuesday, May 5, 2009

My First Radar Chart Google Spreadsheet Gadget

The charts and gadgets in Google spreadsheet offer a number of interesting ways to help visualize our assessment data. Gadgets further offer a way to present the visualizations on a web page or one's iGoogle page. However, Google spreadsheet does not have a radar chart gadget by default, which is an important piece in visualizing the data harvested in our transformative assessment effort (currently known as the Harvesting Grade Book). For brief background information, see my post on Using Google Chart API to Implement Harvesting Grade Book.

I started to look into ways to combine the Google Chart API, Spreadsheet API and Gadget API. While searching around, I found that Greg Schechter had made a spider web Google spreadsheet gadget, and it is listed in the Google Visualization API Gadget Gallery. Great!

But as it turns out, it does not provide all the functionality we need, such as calculating the overall average, drawing a competency line, drawing the scales, and adding a title to the chart; it also does not work in Internet Explorer at all. Still, it gave me, a first-time developer of Google spreadsheet gadgets, a good starting point. And I have found myself busy writing a radar chart Google spreadsheet gadget these days.

The program logic is not difficult at all. But woe to the existence of the Internet Explorer browser. I estimate I spent three times more effort just working around its problems.

First, Internet Explorer does not support the HTML5 canvas tag for 2D command-based drawing. To bring the same functionality to it, I found the ExplorerCanvas JavaScript library, which I could include in the code.

Second, Internet Explorer (specifically, the ExplorerCanvas script) does not support dynamic creation of a canvas using innerHTML. To work around the problem, I needed to use createElement and initElement instead.
if (navigator.appName == "Microsoft Internet Explorer") {
    var chartCanvas = document.createElement("canvas"); = "radarchart_canvas";
    G_vmlCanvasManager.initElement(chartCanvas);
}
Third, Google gadgets run in browser quirks mode, according to the Gadget Specifications. What this means to me is that the gadget renders differently in Internet Explorer. The article Quirks mode and strict mode provides more detailed information. To work around the problem, I had to change the CSS accordingly.

Last (at least for now), Internet Explorer will not draw any arc or circle if the center is a floating point value. To fix the problem, I use parseInt to convert the floating point value to an integer.

This is the first beta release of my radar chart spreadsheet gadget.

If you want to test using this gadget with your own spreadsheet data, you can go to your Google spreadsheet to insert the gadget:
Next, I will work on changing the gadget to use checkboxes for selecting the graphed data on the right. Then one can choose multiple data sets (radar areas) to be displayed together. This allows easy comparison of one data set with another, or with the overall average.

Please give me feedback via the comments section.

Monday, April 20, 2009

SQL Database Point-In-Time Restore of Transaction Log Backup

A Teaching Assistant (TA) deleted all the grades in an assignment in our online learning management system, eLearning (powered by Blackboard Learning System CE). What happened was a series of actions that led to this. Here is what was described to me (I have yet to be able to simulate it successfully):
The TA used the grade book tool to enter the grades of all students in a particular assignment (instead of using the assignment tool). He then changed the due date of that assignment to a later date. The system gave him a warning message, but he simply went ahead. He subsequently found that all the grades he had entered for that assignment were gone.
Now I had to find a way to retrieve the grades from our old backups. But the challenge was that he did not remember when (the approximate date and time) he entered the grades, or when he modified the assignment properties, which caused all the grades to be cleared.

I parsed our web server logs (using Cygwin's gawk), looking for all his POST requests with grade book and other strings in the URL. Yes! I saw the log entries dated April 14 around midnight, with the last update at 0:56am.

Therefore, I restored our eLearning database backups (onto a test instance) to their April 14, 0:57am state. Here is the SQL script:
restore database webctdatabase from DISK='D:\backupfolder\webctdatabase_backup_200904120005.bak' WITH NORECOVERY, REPLACE

restore database webctdatabase from DISK='D:\backupfolder\webctdatabase_backup_200904140005.dif' WITH NORECOVERY

restore log webctdatabase from DISK='D:\backupfolder\webctdatabase_backup_200904140250.trn' WITH RECOVERY, STOPAT = 'Apr 14, 2009 00:57:00 AM'

restore database webctdatabase$PF from DISK='D:\backupfolder\webctdatabase$PF_backup_200904120005.bak' WITH NORECOVERY, REPLACE

restore database webctdatabase$PF from DISK='D:\backupfolder\webctdatabase$PF_backup_200904140005.dif' WITH NORECOVERY

restore log webctdatabase$PF from DISK='D:\backupfolder\webctdatabase$PF_backup_200904140250.trn' WITH RECOVERY, STOPAT = 'Apr 14, 2009 00:57:00 AM'
A little background info: We ran a full eLearning database backup once a week (Sunday), differential backups nightly (Monday to Saturday), and transaction log backups throughout the day. Therefore, when restoring from backup, I needed to restore the full database backup (as of April 12, Sunday), the last differential backup (as of April 14, Tuesday), and the transaction log backups to the specific point in time.
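The selection logic implied by this schedule — the newest full backup before the target time, the newest differential after it, then log backups up through the stop time — can be sketched in Python. The sortable timestamp strings mirror the filenames in the script above; the function itself is an illustration, not a tool we ran:

```python
# Plan a point-in-time restore: pick the newest full backup at or
# before the target, the newest differential between that full and the
# target, and every log backup after the base, up to and including the
# first one taken at or after the target (it holds the transactions
# needed for STOPAT). Timestamps are sortable strings, e.g. "200904140057".
def plan_restore(fulls, diffs, logs, target):
    full = max(t for t in fulls if t <= target)
    diff_candidates = [t for t in diffs if full < t <= target]
    diff = max(diff_candidates) if diff_candidates else None
    base = diff or full
    plan = []
    for t in sorted(t for t in logs if t > base):
        plan.append(t)
        if t >= target:
            break
    return full, diff, plan
```

With the April backups as input, this reproduces the choices in the restore script: the April 12 full, the April 14 differential, and the 02:50 log backup for the 0:57am STOPAT.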

After the restore was completed, I also had to execute the following SQL commands to set the webct and webct$PF users' SIDs on the test instance properly.
Use webctdatabase
sp_change_users_login 'report'

Use webctdatabase
sp_change_users_login 'update_one', 'webct', 'webct'

Use webctdatabase
sp_change_users_login 'update_one', 'webct$PF', 'webct$PF'

Use webctdatabase$PF
sp_change_users_login 'report'

Use webctdatabase$PF
sp_change_users_login 'update_one', 'webct', 'webct'

Use webctdatabase$PF
sp_change_users_login 'update_one', 'webct$PF', 'webct$PF'
Then I restarted the webct service on the frontend node of the test instance to bring the application up. I logged on to it and could see all the student grades in that section/assignment. We exported them for the TA.

Saved by the transaction log backups.

Thursday, March 26, 2009

APC UPS Monitoring using SNMP

I just finished configuring IPSentry to monitor our APC UPS (via SNMP).

Currently, IPSentry will send email alerts to us when the UPS batteries need replacing, etc. But if the main power is out and the remaining battery run time is under a certain threshold, it will send alerts to our cell phones instead. We also tested using it to call a script which automates the shutdown of our servers, and it works. We may decide to implement that eventually, but I feel we need more observation to verify the reliability of the monitoring before doing so (since that script is so powerful that any false alarm could shut down our entire shop).
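The alerting policy above can be sketched as a small decision function. IPSentry itself is configured through its GUI; the threshold value and channel names here are invented placeholders:

```python
# Route UPS alerts per the policy in the post: on battery with low
# remaining runtime pages our cell phones; other battery problems
# (including merely being on battery) go to email. The threshold is
# an illustrative value, not our real configuration.
RUNTIME_THRESHOLD_MIN = 10

def alert_channel(on_battery, runtime_remaining_min, battery_needs_replacing):
    if on_battery and runtime_remaining_min < RUNTIME_THRESHOLD_MIN:
        return "cellphone"
    if battery_needs_replacing or on_battery:
        return "email"
    return None
```

A server-shutdown script would hang off the "cellphone" branch, which is exactly why we want more observation before enabling it.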

Anyway, here are the OIDs we are now monitoring:

  • The elapsed time since the UPS switched to battery power. (timeticks; n minutes = n × 6000 timeticks)
  • The remaining battery capacity, expressed as a percentage of full capacity. (gauge)
  • The current internal UPS temperature, in Celsius. (gauge)
  • The UPS battery run time remaining before battery exhaustion. (timeticks; n minutes = n × 6000 timeticks)
  • Whether the UPS batteries need replacing: noBatteryNeedsReplacing (1) or batteryNeedsReplacing (2).
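The timeticks arithmetic used above follows from SNMP timeticks being hundredths of a second, so one minute is 6000 ticks:

```python
# SNMP timeticks are 1/100 of a second, so 1 minute = 60 * 100 = 6000 ticks.
def ticks_to_minutes(timeticks):
    return timeticks / 6000

def minutes_to_ticks(minutes):
    return minutes * 6000
```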

Tuesday, March 24, 2009

More eLearning User Penetration Numbers

In December 2007, I wrote a script to determine the user penetration rate of eLearning, our online Learning Management System (powered by Blackboard Learning System CE6). We have subsequently upgraded the system to CE8 with all the latest service packs. Fortunately, the script continues to work.

Here are the numbers for the main Pullman campus:

Semester    | Undergraduate (Total / LMS / %) | Graduate (Total / LMS / %) | Undergrad + Graduate (Total / LMS / %)
2007 Fall   | 15441 / 12862 / 83.30           | 1981 / 749 / 37.81         | 17422 / 13611 / 78.13
2008 Spring | 14383 / 12079 / 83.98           | 1874 / 747 / 39.86         | 16257 / 12826 / 78.90
2008 Fall   | 15571 / 14521 / 93.26           | 1998 / 1165 / 58.31        | 17569 / 15686 / 89.28
2009 Spring | 14650 / 12968 / 88.52           | 1939 / 936 / 48.27         | 16589 / 13904 / 83.81

Friday, March 13, 2009

Using rsync on Windows

I posted an article on an issue of SQL Memory Paged Out During Large File Copy last month, which documented my effort to find a way to solve the issue. Yes, rsync (from Cygwin) is still our final choice.

Today I'm going to post some tips on using rsync on Windows.

To install rsync, run the Cygwin setup.exe installer and install the rsync package. After that, I would recommend adding Cygwin's bin folder to your PATH environment variable. This way, you can call rsync directly from the command line.
  • Right click on My Computer. Choose Properties.
  • Click on Advanced tab.
  • Click on Environment Variables button. (The Environment Variables window will be opened)
  • Under System variables, highlight Path. Click Edit.
  • At the end of the Variable value, append ;c:\cygwin\bin (or wherever your installation path is; don't forget the leading semicolon).
  • Click OK | OK | OK to finish.
Now you can use rsync to copy files in the command window, like the following:
rsync.exe \\fromserver\share\filename d:\backupfolder\filename
But there is another issue: the destination file (copied via rsync) is owned by the user account running rsync. One may prefer to have the file permissions inherited from the parent folder instead. To fix that, I use the SetACL utility from SourceForge.

The following commands first change the file ownership to the local Administrators group of the backup server, then clear the discretionary and system access control lists (DACL, SACL) of the file, and finally set the ACLs to be inherited from the parent folder.
setacl.exe -on filename -ot file -actn setowner -ownr "n:backupserver\Administrators;s:n"
setacl.exe -on filename -ot file -actn clear -clr dacl,sacl
setacl.exe -on filename -ot file -actn setprot -op "dacl:np;sacl:np"
Put them all into a batch file, and you have a simple script to perform nightly backups using rsync.

Friday, February 20, 2009

Web Slides on Harvesting Grade Book

Diigo has a feature where you can create a "List" of bookmarks. Inside the list, you can order the bookmarks in any sequence you want, and create a slide show from it.

I just created a series called The Evolution of Harvesting Grade Book. It tells the story behind some of the transformative assessment projects our department has been working on - visualized through what is currently called the "Harvesting Grade Book".

Friday, February 13, 2009

SQL Memory Paged Out During Large File Copy

We currently use a SQL Server maintenance plan to generate full database backup files (nightly or weekly, depending on the application), plus differential and transaction log backup files throughout the day. We then run a script to robocopy (from the Windows Server 2003 Resource Kit Tools) the generated backup files offsite.

Ever since we migrated our databases from 32-bit to 64-bit SQL Server, we have been encountering a problem where SQL Server process memory was being paged out. As a result, our applications became momentarily unresponsive and unavailable to our customers. This happened consistently around the same time every night, which coincided with the time when the offsite copying of the SQL full database backup files was running. The error message in the SQL Server log was:
A significant part of sql server process memory has been paged out. This may result in a performance degradation. Duration: 0 seconds. Working set (KB): xxxxxx, committed (KB): xxxxxx, memory utilization: xx%.
According to this Microsoft article: How to reduce paging of buffer pool memory in the 64-bit version of SQL Server 2005, only SQL Server 2005 Enterprise Edition can use Lock pages in memory to reduce paging. We are running SQL Server 2005 Standard Edition, so we may need to upgrade/migrate to Enterprise Edition. That means additional licensing and migration costs to consider.

In the meantime, our applications are failing momentarily every night. I need to come up with a better approach now.

Our requirements are pretty simple (listed in the order of importance):
  • No paging of SQL server memory.
  • Can be automated via GUI, or script.
  • Still maintain a copy of the database backup files locally. (This is preferable because, in the event of a problem requiring a full database restore, the process would be much faster.)
  • Fast.
Is that too much to ask for?

Google helped me locate this article on TechNet's Ask the Performance Team blog: Slow Large File Copy Issue. It has a pretty good explanation of what goes on (the use of buffered I/O) during the copy process, which leads to the paging problem we encountered.

I followed the recommendation in the article and tested using eseutil to perform offsite copying of our database backup files. Unfortunately, it did not work as expected. SQL server process memory was still being paged out.

I continued to investigate options. Here is a list of what I have looked into:
  • cat (from Cygwin)
  • split to remote share and cat to combine (from Cygwin)
  • split locally, copy to remote share, and cat to combine (from Cygwin)
  • rsync (from Cygwin)
  • Symantec Backup Exec backup agent
  • Symantec Backup Exec SQL backup agent
  • Windows NTBackup
  • SQL backup to remote share
  • SQL backup using the MIRROR TO clause (eliminated since it is available in SQL Server 2005 Enterprise Edition only)
  • SQL database mirroring, with backups taken from the mirrored databases (eliminated since the backup option is not available on read-only mirrored databases)
  • SQL backup to a file server cluster volume; after the database backup files are generated, fail over to the other file server node and run the offsite backup against it (not recommended due to the additional complexity; we might be better off migrating to SQL Server 2005 Enterprise Edition)
After some testing, the winner (or the best compromise) is rsync.

We continue to use a SQL Server maintenance plan to generate all database backup files. The files are kept locally on the SQL server. We then run a script to rsync the files offsite. It never pages out our SQL Server process memory. It still takes a long time to run, but compared to the other approaches, it is the fastest.

Our largest database is the webctdatabase of our Blackboard LMS. It is now about 400GB, and it takes about 15 hours for rsync to copy it offsite.

Wednesday, February 11, 2009

SQL Injection Exploit... of a Mom

It's brilliant.

SQL Script to Generate a List of Sections (& Instructors) from Blackboard Learning System CE8

We need to generate a list of all courses/ sections hosted in our Learning Management System (powered by Blackboard Learning System CE8, formerly known as WebCT).

In Blackboard's terminology, all courses and sections are learning contexts. They are hierarchical, and their parent-child relationships, along with other information, are stored in the database. So the challenge here is to figure out how to write a recursive query.

With a little help from an MSDN article, Recursive Queries Using Common Table Expressions, I figured out how to do that today.

The following SQL query will generate a list of all sections, ordered by course name and then by section name:
with temp1 (parent_learning_context_id, learning_context_id, name, level)
as (
select parent_learning_context_id, learning_context_id,
name, 0 as level from rpt_learning_context
where parent_learning_context_id IS NULL
union all
select LC.parent_learning_context_id,
LC.learning_context_id,, level + 1
from rpt_learning_context as LC
inner join temp1 as T
on LC.parent_learning_context_id = T.learning_context_id
)
select T.parent_learning_context_id,,
T.learning_context_id, from temp1 as T
inner join learning_context as LC
on T.parent_learning_context_id =
where T.level = 4
order by,
With a little twist, I can easily modify it to generate a list of all instructors teaching this Spring semester:
with temp1 (parent_learning_context_id, learning_context_id, name, level)
as (
select parent_learning_context_id, learning_context_id,
name, 0 as level from rpt_learning_context
where parent_learning_context_id IS NULL
union all
select LC.parent_learning_context_id,
LC.learning_context_id,, level + 1
from rpt_learning_context as LC
inner join temp1 as T
on LC.parent_learning_context_id = T.learning_context_id
)
select,, P.webct_id_lowercase,
P.name_n_given, P.name_n_family, from temp1 as T
inner join learning_context as LC
on T.parent_learning_context_id =
inner join rpt_member as M
on M.learning_context_id = T.learning_context_id
inner join person as P
on M.person_id =
where T.level = 4
and like '%2009_spring%'
and M.role='SINS'
and = '1'
and M.denied_access = '0'
order by
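For intuition, the recursion these queries perform is just a depth-tracking walk of the parent-child table. Here is a Python sketch of the same level-4 selection; the rows and names are invented:

```python
# The recursive CTE is equivalent to walking the learning-context tree
# from the roots (parent is NULL -> None), tracking depth, and keeping
# the nodes found at level 4 (the sections). Rows are
# (parent_id, id, name) tuples.
def sections_at_level(rows, wanted_level=4):
    children = {}
    for parent, node, name in rows:
        children.setdefault(parent, []).append((node, name))
    result = []

    def walk(parent, level):
        for node, name in children.get(parent, []):
            if level == wanted_level:
                result.append(name)
            walk(node, level + 1)

    walk(None, 0)  # roots start at level 0, as in the CTE anchor
    return sorted(result)

# An invented five-level chain: root -> ... -> course -> sections.
rows = [
    (None, 1, "root"),
    (1, 2, "institution"),
    (2, 3, "division"),
    (3, 4, "Math_101"),
    (4, 5, "Math_101_Sec01"),
    (4, 6, "Math_101_Sec02"),
]
```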
SQL queries - I'm loving it.

Sunday, February 8, 2009

Using Google Chart API to Implement Harvesting Grade Book

Our department has been working on a transformative assessment approach to teaching and learning. One of the challenges of implementing the idea is providing rich renderings to help conceptualize and communicate the assessment data, one of which is in the form of what we currently call the Harvesting Grade Book (see "Rich Assessment From A Harvesting Grade Book" for more information).

Currently, the graphs in the Harvesting Grade Book example are generated manually. The data is downloaded into Excel and then graphed. This manual approach, though, has its limitations: the results cannot be scaled and cannot be updated in real time. We need to find ways to automate this process.

I looked at options. The Microsoft LogParser utility was a potential solution I investigated to address the scaling issue. For those who are not familiar with it, LogParser is a powerful command line utility. It takes input from a variety of sources, such as CSV files, SQL database queries, and even Active Directory via LDAP. The output can also be in a variety of formats, such as XML, charts, etc. I have used it to generate a number of monitoring and usage graphs for the Blackboard Learning Management System. I believe my familiarity with LogParser would enable me to use the utility, in conjunction with a wrapper script, to generate graphs for the Harvesting Grade Book in batch.

In order to overcome the second limitation, real-time updates, I decided to experiment with the Google Chart API. For those who are not familiar with it, the Google Chart API lets you dynamically generate charts on the fly by sending it a REST web service call (basically a URL that contains the chart properties and data). The service then returns the chart as an image in PNG format. What makes this solution elegant and easy to apply is that the URL contains the data. This kind of web-based charting is far easier and more efficient than the manual compilations and complications of using Excel.

For example, a radar graph like those in the Harvesting Grade Book example can be created with a URL:
Google will then return the following graph:
Perfect! And as for any concerns about usage limits of the Google Chart API service: "There's no limit to the number of calls per day you can make to the Google Chart API... If you think your service will make more than 250,000 API calls per day, please let us know by mailing an estimate to ...". With a number like 250,000, I consider the scalability issue solved.
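For illustration, a radar-chart request URL like the one above can be assembled in a few lines of Python. The parameters cht=r (radar chart), chs (image size), chd=t: (text-format data), and chtt (title) are documented Chart API parameters; the data values and title here are made up:

```python
from urllib.parse import urlencode

# Build a Google Chart API radar-chart URL: cht=r selects a radar
# chart, chs sets the image size, chd=t: carries the data points, and
# chtt sets the chart title. Values here are invented.
def radar_chart_url(values, title, size="300x300"):
    params = {
        "cht": "r",
        "chs": size,
        "chd": "t:" + ",".join(str(v) for v in values),
        "chtt": title,
    }
    return "" + urlencode(params)

url = radar_chart_url([10, 20, 30], "Demo Chart")
```

Requesting that URL in a browser (or an img tag) returns the rendered PNG, which is the whole appeal: the data lives in the URL.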

Thursday, February 5, 2009

SQL Script to Download Grades from Blackboard Learning System CE8

I just finished a SQL query to download all grade book data from Blackboard Learning System CE8.
select as Semester,
GB.learning_context_name as Section,
P.source_id as StudentID,
from rpt_ext_gradebook as GB
inner join rpt_person as P
on P.person_id = GB.person_id
inner join rpt_learning_context as LC1
on GB.learning_context_id = LC1.learning_context_id
inner join rpt_learning_context as LC2
on LC1.parent_learning_context_id = LC2.learning_context_id
where P.demo_user = 0
order by, GB.learning_context_name, column_name, user_login_name
This depends on the RPT_EXT_GRADEBOOK background job running successfully to have the data populated properly.

I ran it against a clone of our production database. It took 1.5 minutes to retrieve 2.2 million grade records. I spot-checked a number of courses (sections). It included the resulting values of grades even when they were calculated by formula. It also gave me the original grades, any overrides made, and the final grades. Very nice!

Friday, January 9, 2009

Google Search Finds Missing Child

I find this BBC news story, Google search finds missing child, interesting. A nine-year-old girl, allegedly kidnapped by her grandmother, was found using a mobile phone signal and Google Street View.

What a creative and brilliant way of using technology.