
Monday, April 20, 2009

SQL Database Point-In-Time Restore of Transaction Log Backup

A Teaching Assistant (TA) deleted all the grades in an assignment in our online learning management system, eLearning (powered by Blackboard Learning System CE). What happened was a series of actions that led to this. Here is what was described to me (I have not yet been able to reproduce it):
The TA used the grade book tool to enter the grades of all students in a particular assignment (instead of using the assignment tool). He then changed the due date of that assignment to a later date. The system gave him a warning message, but he went ahead anyway. He subsequently found that all the grades he had entered for that assignment were gone.
Now I had to find a way to retrieve the grades from our old backups. But the challenge was that he did not remember when (even the approximate date and time) he had entered the grades, or when he had modified the assignment properties that caused all the grades to be cleared.

I parsed our web server logs (using Cygwin's gawk) looking for all of his POST requests with the grade book and other strings in the URL. Yes! I saw the log entries dated April 14 around midnight, with the last update at 0:56am.
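The filter was just a gawk pattern match, something along these lines (a rough sketch: the log file name, the username, and the URL strings are placeholders, and the exact patterns depend on your web server's log format):
gawk '/POST/ && /gradebook/ && /ta_username/' ex090414.log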

Therefore, I restored our eLearning database backups (onto a test instance) to the state as of April 14, 0:57am. Here is the SQL script.
restore database webctdatabase from DISK='D:\backupfolder\webctdatabase_backup_200904120005.bak' WITH NORECOVERY, REPLACE

restore database webctdatabase from DISK='D:\backupfolder\webctdatabase_backup_200904140005.dif' WITH NORECOVERY

restore log webctdatabase from DISK='D:\backupfolder\webctdatabase_backup_200904140250.trn' WITH RECOVERY, STOPAT = 'Apr 14, 2009 00:57:00 AM'

restore database webctdatabase$PF from DISK='D:\backupfolder\webctdatabase$PF_backup_200904120005.bak' WITH NORECOVERY, REPLACE

restore database webctdatabase$PF from DISK='D:\backupfolder\webctdatabase$PF_backup_200904140005.dif' WITH NORECOVERY

restore log webctdatabase$PF from DISK='D:\backupfolder\webctdatabase$PF_backup_200904140250.trn' WITH RECOVERY, STOPAT = 'Apr 14, 2009 00:57:00 AM'
A little background info: we ran a full database backup of eLearning once a week (Sunday), a differential backup nightly (Monday to Saturday), and transaction log backups throughout the day. Therefore, when restoring from backup, I needed to restore the full database backup (from Sunday, April 12), the latest differential backup (from Tuesday, April 14), and then the transaction log backup up to the specific point in time.
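If you are not sure which backup file covers the target point in time, RESTORE HEADERONLY can be run against each candidate file first; the BackupStartDate and BackupFinishDate columns show the window each backup covers. For example, using the same transaction log backup file as above:
restore headeronly from DISK='D:\backupfolder\webctdatabase_backup_200904140250.trn'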

After the restore was completed, I also had to execute the following SQL commands to remap the webct and webct$PF database users to the corresponding logins on the test instance (they show up as orphaned users there because the logins' SIDs differ): 'report' lists the orphaned users, and 'update_one' re-links each user to its login.
Use webctdatabase
go
sp_change_users_login 'report'

Use webctdatabase
go
sp_change_users_login 'update_one', 'webct', 'webct'

Use webctdatabase
go
sp_change_users_login 'update_one', 'webct$PF', 'webct$PF'


Use webctdatabase$PF
go
sp_change_users_login 'report'

Use webctdatabase$PF
go
sp_change_users_login 'update_one', 'webct', 'webct'

Use webctdatabase$PF
go
sp_change_users_login 'update_one', 'webct$PF', 'webct$PF'
Then I restarted the WebCT service on the frontend node of the test instance to bring up the application. I logged on to it and could see all the student grades in that section/assignment. We exported them for the TA.

Saved by the transaction log backups.

Friday, February 13, 2009

SQL Memory Paged Out During Large File Copy

We currently use SQL Server maintenance plans to generate full database backup files (nightly or weekly, depending on the application), plus differential and transaction log backup files throughout the day. We then run a script to robocopy (from the Windows Server 2003 Resource Kit Tools) the generated backup files offsite.

Ever since we migrated our databases from 32-bit SQL Server to 64-bit, we have been encountering a problem where SQL Server process memory gets paged out. As a result, our applications become momentarily unresponsive and unavailable to our customers. This happened consistently around the same time every night, which coincided with the time when the offsite copying of the SQL full database backup files was running. The error message in the SQL Server log was:
A significant part of sql server process memory has been paged out. This may result in a performance degradation. Duration: 0 seconds. Working set (KB): xxxxxx, committed (KB): xxxxxx, memory utilization: xx%.
According to this Microsoft article, How to reduce paging of buffer pool memory in the 64-bit version of SQL Server 2005, only SQL Server 2005 Enterprise Edition can use the Lock Pages in Memory privilege to reduce paging. We are running SQL Server 2005 Standard Edition, so we may need to upgrade/migrate to Enterprise Edition. That means additional licensing and migration costs to consider.
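If you need to double-check which edition and build you are on before planning such a move, SERVERPROPERTY gives a quick answer:
select SERVERPROPERTY('Edition') as Edition, SERVERPROPERTY('ProductVersion') as ProductVersion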

In the meantime, our applications are failing momentarily every night. I need to come up with a better approach now.

Our requirements are pretty simple (listed in order of importance):
  • No paging of SQL Server memory.
  • Can be automated via GUI or script.
  • Still maintains a local copy of the database backup files. (This is preferable because, in the event of a problem that requires a full database restore, the process would be much faster.)
  • Fast.
Is that too much to ask for?

Google helped me locate this article, TechNet's Ask the Performance Team blog: Slow Large File Copy Issue. It has a pretty good explanation of what goes on during the copy process (the use of buffered I/O) and why it leads to the paging problem we encountered.

I followed the recommendation in the article and tested using eseutil to perform the offsite copying of our database backup files. Unfortunately, it did not work as expected: SQL Server process memory was still being paged out.
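For the record, the test used eseutil's unbuffered copy mode, along these lines (a sketch only; the source and destination paths are placeholders, and eseutil.exe comes from the Exchange tools as described in the TechNet article):
eseutil /y D:\backupfolder\webctdatabase_backup.bak /d \\offsiteserver\backups\webctdatabase_backup.bak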

I continued to investigate options. Here is a list of what I looked into:
  • cat (from Cygwin)
  • split to remote share and cat to combine (from Cygwin)
  • split locally, copy to remote share, and cat to combine (from Cygwin)
  • rsync (from Cygwin)
  • Symantec Backup Exec backup agent
  • Symantec Backup Exec SQL backup agent
  • Windows NTBackup
  • SQL backup to remote share
  • SQL backup using the MIRROR TO clause (Eliminated since it is available in SQL Server 2005 Enterprise Edition only)
  • SQL database mirroring and backup from mirrored databases (Eliminated since backup option is not available in the read-only mirrored databases)
  • SQL backup to a file server cluster volume; after the database backup files are generated, fail over to the other file server node and run the offsite backup against it (Not recommended due to the additional complexity. We may be better off migrating to SQL Server 2005 Enterprise Edition)
After some testing, the winner (or the best compromise) was rsync.

We continue to use SQL Server maintenance plans to generate all the database backup files, which are kept locally on the SQL Server. We then run a script to rsync the files offsite. It never pages out our SQL Server process memory. It still takes a long time to run, but compared to the other approaches, this is the fastest one.
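The offsite copy itself is just a scheduled rsync run under Cygwin, something along these lines (a sketch only; the local backup folder and the mapped offsite drive are placeholders, not our exact script):
rsync -rtv --partial /cygdrive/d/backupfolder/ /cygdrive/z/offsite_backups/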

Our largest database is the webctdatabase of our Blackboard LMS. It is now about 400GB, and it takes about 15 hours for rsync to copy it offsite.

Wednesday, February 11, 2009

SQL Injection Exploit... of a Mom



It's brilliant.

SQL Script to Generate a List of Sections (& Instructors) from Blackboard Learning System CE8

We need to generate a list of all courses/sections hosted in our Learning Management System (powered by Blackboard Learning System CE8, formerly known as WebCT).

In Blackboard's terminology, all courses and sections are learning contexts. They are hierarchical, and their parent-child relationships, along with other information, are stored in the database. So the challenge here is to figure out how to write a recursive query.

With a little help from an MSDN article, Recursive Queries Using Common Table Expressions, I figured out how to do that today.

The following SQL query will generate a list of all sections, ordered by course name and then by section name:
with temp1 (parent_learning_context_id, learning_context_id, name, level)
as
(
select parent_learning_context_id, learning_context_id,
name, 0 as level from rpt_learning_context
where parent_learning_context_id IS NULL
union all
select LC.parent_learning_context_id,
LC.learning_context_id, LC.name, level + 1
from rpt_learning_context as LC
inner join temp1 as T
on LC.parent_learning_context_id = T.learning_context_id
)
select T.parent_learning_context_id, LC.name,
T.learning_context_id, T.name from temp1 as T
inner join learning_context as LC
on T.parent_learning_context_id = LC.id
where T.level = 4
order by LC.name, T.name
With a little twist, I can easily modify it to generate a list of all instructors teaching this Spring semester:
with temp1 (parent_learning_context_id, learning_context_id, name, level)
as
(
select parent_learning_context_id, learning_context_id,
name, 0 as level from rpt_learning_context
where parent_learning_context_id IS NULL
union all
select LC.parent_learning_context_id,
LC.learning_context_id, LC.name, level + 1
from rpt_learning_context as LC
inner join temp1 as T
on LC.parent_learning_context_id = T.learning_context_id
)
select LC.name, T.name, P.webct_id_lowercase,
P.name_n_given, P.name_n_family, P.email from temp1 as T
inner join learning_context as LC
on T.parent_learning_context_id = LC.id
inner join rpt_member as M
on M.learning_context_id = T.learning_context_id
inner join person as P
on M.person_id = p.id
where T.level = 4
and LC.name like '%2009_spring%'
and M.role='SINS'
and M.active = '1'
and M.denied_access = '0'
order by T.name
SQL queries - I'm loving it.

Thursday, February 5, 2009

SQL Script to Download Grades from Blackboard Learning System CE8

I just finished a SQL query to download all grade book data from Blackboard Learning System CE8.
select
LC2.name as Semester,
GB.learning_context_name as Section,
P.source_id as StudentID,
GB.user_login_name,
GB.given_name,
GB.family_name,
GB.column_name,
GB.column_type,
GB.original_value,
GB.override,
GB.max_points,
GB.final_value
from rpt_ext_gradebook as GB
inner join rpt_person as P
on P.person_id = GB.person_id
inner join rpt_learning_context as LC1
on GB.learning_context_id = LC1.learning_context_id
inner join rpt_learning_context as LC2
on LC1.parent_learning_context_id = LC2.learning_context_id
where P.demo_user = 0
order by LC2.name, GB.learning_context_name, column_name, user_login_name
This depends on the RPT_EXT_GRADEBOOK background job running successfully so that the data is populated properly.
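A quick sanity check before trusting the extract is simply to confirm the reporting table has rows (nothing Blackboard-specific here beyond the table name used in the query above):
select count(*) from rpt_ext_gradebook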

I ran it against a clone of our production database. It took 1.5 minutes to retrieve 2.2 million grade records. I spot-checked a number of courses (sections). It included the resulting values of grades even when they were calculated by formula. It also gave me the original grades, any overrides made, and the final grades. Very nice!

Wednesday, October 29, 2008

SQL Injection Attacks by Country of Origin

Here is a world map of SQL injection attacks by their country of origin from January to October this year. The darker the color, the higher the number of hits or percentage.



Here is a pie chart representation of the information.


Monday, October 13, 2008

Wrapper for Microsoft SQL Injection Source Code Analyzer Tool

Microsoft has released a SQL Injection Source Code Analyzer for ASP code. Refer to KB 954476 for more information about the tool. However, the analyzer can only check one ASP page at a time; it does not recursively scan all ASP files in a folder. Therefore, I wrote a simple wrapper around it. Here is the source code:
@echo off
setlocal
rem Path to the ASP application and its shared include folder
set source=\\servername\applicationname
set include=%source%\include
set logfile=check_applicationname.log
IF EXIST %logfile% DEL %logfile%
rem Recurse through every .asp file under %source%; delims=? keeps full paths (including spaces) intact in %%i
FOR /F "usebackq tokens=1 delims=?" %%i IN (`dir /S /B %source%\*.asp`) DO (
msscasi_asp.exe /NoLogo /GlobalAsaPath=%source% /input="%%i" /IncludePaths=%include% >> %logfile%
)
endlocal

Thursday, July 31, 2008

SQL Mirroring Manual Failover for eLearning (Blackboard Learning System CE8)

In my previous SQL Mirroring for eLearning post, I listed the details of how to set up SQL database mirroring for Blackboard Learning System CE8. Since the application (WebLogic) does not support automatic failover, we need to have a procedure and scripts ready to perform a manual failover in the event of a disaster (when the production database server is no longer available).

Here is how:
  • Stop the WebCT service on all frontend nodes.
  • If you use asynchronous mirroring, execute the following commands on the mirror (business continuity) server to force failover. The databases will then be in "In Recovery" status; it will take a minute for them to change to "Principal, Disconnected" status. The Disconnected status means that there is no active mirroring session for this new principal server, which is expected.
ALTER DATABASE webctdatabase SET PARTNER FORCE_SERVICE_ALLOW_DATA_LOSS

ALTER DATABASE webctdatabase$PF SET PARTNER FORCE_SERVICE_ALLOW_DATA_LOSS
  • If you use synchronous mirroring instead, execute the following commands to fail over:
ALTER DATABASE webctdatabase SET PARTNER FAILOVER

ALTER DATABASE webctdatabase$PF SET PARTNER FAILOVER
  • After failover, you need to execute the following on the new principal server to remap the webct and webct$PF database users to the logins there (fixing the orphaned users).
Use webctdatabase
go
sp_change_users_login 'report'

Use webctdatabase
go
sp_change_users_login 'update_one', 'webct', 'webct'

Use webctdatabase
go
sp_change_users_login 'update_one', 'webct$PF', 'webct$PF'


Use webctdatabase$PF
go
sp_change_users_login 'report'

Use webctdatabase$PF
go
sp_change_users_login 'update_one', 'webct', 'webct'

Use webctdatabase$PF
go
sp_change_users_login 'update_one', 'webct$PF', 'webct$PF'
  • Then you will need to change the JDBC connection strings on the admin node and all managed nodes to point the application at the new principal database server.
cd \bea92\weblogic92\config\WebCTDomain\config\jdbc\

copy /y CP-WebCTConnectionPool$PF-jdbc.xml CP-WebCTConnectionPool$PF-jdbc.before_db_migration.xml
copy /y CP-WebCTConnectionPool-jdbc.xml CP-WebCTConnectionPool-jdbc.before_db_migration.xml
copy /y CP-WebCTJMSConnectionPool-jdbc.xml CP-WebCTJMSConnectionPool-jdbc.before_db_migration.xml

cat CP-WebCTConnectionPool$PF-jdbc.before_db_migration.xml | sed 's/ProdServer/BusContServer/' > CP-WebCTConnectionPool$PF-jdbc.xml
cat CP-WebCTConnectionPool-jdbc.before_db_migration.xml | sed 's/ProdServer/BusContServer/' > CP-WebCTConnectionPool-jdbc.xml
cat CP-WebCTJMSConnectionPool-jdbc.before_db_migration.xml | sed 's/ProdServer/BusContServer/' > CP-WebCTJMSConnectionPool-jdbc.xml

diff CP-WebCTConnectionPool$PF-jdbc.before_db_migration.xml CP-WebCTConnectionPool$PF-jdbc.xml
diff CP-WebCTConnectionPool-jdbc.before_db_migration.xml CP-WebCTConnectionPool-jdbc.xml
diff CP-WebCTJMSConnectionPool-jdbc.before_db_migration.xml CP-WebCTJMSConnectionPool-jdbc.xml
  • Bring up the Blackboard CE8 cluster.

SQL Mirroring for eLearning (Blackboard Learning System CE8)

We finally have a set of servers in another building designated for business continuity of our eLearning Learning Management System (powered by Blackboard Learning System CE8). We have a procedure to failover to that location in the event of a disaster. We even ran a contingency drill back in May to verify the process.

One of the outstanding problems we have is that eLearning's database is so huge (and getting larger every day) that, in the event of a disaster, it takes a very long time to restore it from backup. In order to minimize the potential downtime, we decided to use SQL database mirroring to mirror eLearning's database from our current production server to the business continuity server.

Here is how:
  • Back up eLearning's databases on the principal (production) server and restore them to the mirror (business continuity) server using the NORECOVERY option, so that the log files on both servers match exactly.
restore database webctdatabase from DISK='D:\folder\webctdatabase.bak' WITH NORECOVERY, REPLACE

restore database webctdatabase$PF from DISK='D:\folder\webctdatabase$PF.bak' WITH NORECOVERY, REPLACE
  • On both the principal (production) and mirror (business continuity) servers, create a database mirroring partner endpoint. (TCP port 5022 is used in this example since that is the default port number, but it can be changed to any available TCP port.)
CREATE ENDPOINT mirroring_end_point
STATE=STARTED
AS TCP(LISTENER_PORT=5022)
FOR DATABASE_MIRRORING (ROLE=PARTNER)
  • On the mirror server, specify the principal server as the mirror partner.
ALTER DATABASE webctdatabase SET PARTNER = 'TCP://ProdServerName:5022'

ALTER DATABASE webctdatabase$PF SET PARTNER = 'TCP://ProdServerName:5022'
  • On the principal server, specify the mirror server as the mirror partner.
ALTER DATABASE webctdatabase SET PARTNER = 'TCP://BusContServerName:5022'

ALTER DATABASE webctdatabase$PF SET PARTNER = 'TCP://BusContServerName:5022'
  • Now the principal server will start sending transactions to the mirror server.
  • Initially, the principal server and the mirror server will be in the Synchronizing state. After all transactions from the principal server have been reapplied on the mirror server, their state will change to Synchronized (a quick way to check the state is shown below).
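The mirroring role and state of each database can be checked from either server with the sys.database_mirroring catalog view, for example:
select DB_NAME(database_id) as database_name,
mirroring_role_desc, mirroring_state_desc, mirroring_partner_name
from sys.database_mirroring
where mirroring_guid IS NOT NULL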
By default, a mirroring session operates in synchronous mode, but you may prefer the session to be asynchronous for performance reasons (e.g. high latency between the principal and mirror servers). You can execute the following commands on the principal server to change the mirroring session to asynchronous mode.
ALTER DATABASE webctdatabase SET PARTNER SAFETY OFF

ALTER DATABASE webctdatabase$PF SET PARTNER SAFETY OFF
It is also worth noting that one can set up a SQL Server instance as a witness server for automatic failover, but the application has to support it. In our case, Blackboard CE8 does not, so we did not pursue that path. Instead, we have a procedure and scripts to perform a manual failover.