Thursday, August 30, 2007

Dashboard Insight - Editor's Note

A new magazine, sponsored by Dundas Software.

If we can tap into and leverage the collective consciousness of the business intelligence community, then we will achieve our goal of providing a forum for relevant thought and expression of ideas on the smart and effective use of dashboards. And we will have served a useful purpose for the business intelligence community.

Source: Dashboard Insight - Editor's Note

SQL Server Manageability Team Blog : Custom Reports in Management Studio

 

Implementation

Custom reports stored as report definition (.rdl) files are created by using Report Definition Language (RDL). RDL contains data retrieval and layout information for a report in an XML format. RDL is an open schema; developers can extend RDL with additional attributes and elements. Reports may execute any valid T-SQL statement.

If Object Explorer is connected to a server, custom reports can execute in the context of the current Object Explorer selection if they reference that node's report parameters. This enables a report to use the current context, such as the current database, or a consistent context, such as specifying a designated database as part of the T-SQL statement that is contained in the custom report.
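As a sketch of the idea (the query is mine, not from the post), the dataset embedded in a custom report's RDL is just an ordinary T-SQL statement, which then runs against whatever database is current in Object Explorer:

```sql
-- Illustrative dataset query for a custom report: lists the tables in the
-- database currently selected in Object Explorer, most recently changed first.
SELECT name, create_date, modify_date
FROM sys.tables
ORDER BY modify_date DESC;
```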

SQL Server Manageability Team Blog : Custom Reports in Management Studio

Aaron Bertrand : A custom report for Management Studio : Show Blocking

Quick and easy way to see blocked processes. 

So guess what I set out to do first? I created my own blocking report that includes the information I deem most useful to finding the root cause and stamping it out. Why should I spend all my time typing out sp_who2, and sp_lockinfo, and select * from sys.dm_exec_requests, and DBCC INPUTBUFFER, when I can create a report that does most of that work for me in a couple of clicks?

Based loosely on the procedure I created in my article, "Can I create a more robust and flexible version of sp_who2 using SQL Server 2005's DMVs?", I wrote the following stored procedure that would be consumed by a custom report:
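The procedure itself is in the linked article. As a rough sketch of the same idea (mine, not Aaron's code), a minimal blocking query against the SQL Server 2005 DMVs might look like:

```sql
-- Sessions that are currently blocked, who is blocking them, and the
-- statement each blocked session is running. A sketch, not the article's proc.
SELECT r.session_id,
       r.blocking_session_id,
       r.wait_type,
       r.wait_time,
       t.text AS current_statement
FROM sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE r.blocking_session_id <> 0;
```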

Aaron Bertrand : A custom report for Management Studio : Show Blocking

Friday, August 24, 2007

Passed 70-446 BI Exam

 
After passing the required exams, I'm now a Microsoft Certified Technology Specialist in Microsoft SQL Server 2005 Business Intelligence (MCTS: SQL Server 2005 BI), which won't fit on my business cards, but that's okay.

The last exam was brutally long (5 case studies, 9-10 questions per) and I will probably have nightmares tonight about Datum corp.

The funniest part was that I got a perfect mark in the Data Mining category, and that was the one I was most concerned about.

My studying strategy for this one was similar to 70-445, though I skimped on some of the research because of work commitments:

  1. Get the outline from MS's site.
  2. Export it to Excel and select the items I'm not familiar with.
  3. Bring them into OneNote.
  4. Create tabs for each item to expand on.
  5. Cross them out as I get the research.
  6. Google each outline item with the category or "MSDN" or "Technet" to get more info.
  7. Search Blogsearch & Technorati.
  8. Search for the specific items I'm not familiar with using the same wording as the MS outline.
  9. Read the books.  (The Kimball Data Warehousing book was excellent)

The exam took about 2 1/2 hours.  At the very end of the exam, after I found out my mark, the application hung and wouldn't print the score.  We spent about 10 minutes trying to get it to print, and I came home with 10 copies of my result.

Here's the outline of the exam.

1.    Select appropriate BI technologies.
2.    Specify the appropriate SQL Server edition.
3.    Design dimensional models.
4.    Design dimensions for each subject area.
5.    Design fact tables for each subject area.
6.    Identify current dimensions that can be reused.
7.    Identify elements that must be added to existing dimensions or fact tables.
8.    Design new physical objects based on a logical model.
9.    Design an indexing strategy.
10. Design a surrogate key strategy.
11. Identify appropriate business keys.
12. Design a partitioning strategy.
13. Identify design constraints.
14. Identify changed data in the source system.
15. Decide the strategy for decoding textual values.
16. Decide whether to implement fast load.
17. Design appropriate destination components to handle new and updated records during incremental loads.
18. Identify appropriate transformations and transformation options.
19. Design data flow.
20. Identify appropriate control flow items.
21. Design the control flow sequence.
22. Identify appropriate uses and placement of event handlers.
23. Identify appropriate uses and placement of checkpoints.
24. Identify appropriate uses of logging.
25. Identify appropriate uses of data flow error handling.
26. Select appropriate uses of shared data sources.
27. Select appropriate uses of stored procedures or user-defined functions.
28. Define appropriate security roles.
29. Specify folder security.
30. Specify field-level security.
31. Select appropriate uses of Report Designer.
32. Select appropriate uses of Report Definition Language (RDL).
33. Select appropriate uses of Report Builder.
34. Decide appropriate uses of datasets.
35. Decide appropriate uses of subreports.
36. Decide the appropriate placement of extensive business logic.
37. Identify appropriate uses of report snapshots.
38. Identify appropriate uses of on-demand reports.
39. Identify appropriate uses of on-demand-from-cache reports.
40. Identify appropriate uses of standard subscriptions.
41. Identify appropriate uses of data-driven subscriptions.
42. Identify appropriate report-delivery methods for subscriptions.
43. Identify appropriate uses of the Reporting Services Web Service library.
44. Identify appropriate uses of the Reporting Services Configuration tool.
45. Select appropriate uses of named queries.
46. Select appropriate uses of named calculations.
47. Select appropriate uses of denormalization strategies.
48. Identify appropriate uses of attribute relationships.
49. Identify appropriate uses of column binding to support a user-defined reporting hierarchy.
50. Select a design for implementing a ragged hierarchy.
51. Select an appropriate strategy to implement member properties.
52. Identify appropriate uses of calculated members.
53. Identify appropriate uses of actions.
54. Identify appropriate uses of key performance indicators (KPIs).
55. Identify appropriate uses of perspectives.
56. Identify appropriate uses of translations.
57. Identify appropriate uses of drillthrough.
58. Identify a relationship type.
59. Identify appropriate uses of role-playing dimensions.
60. Choose an appropriate strategy to handle unknown dimension members.
61. Define appropriate security roles.
62. Design dimension security.
63. Design cell security by using Multidimensional Expressions (MDX).
64. Design a partitioning strategy for optimal data availability.
65. Decide whether proactive caching is an appropriate solution.
66. Design partition storage settings.
67. Select a dimension storage mode.
68. Identify appropriate algorithms to meet requirements.
69. Classify data as input, key, predict, and ignore.
70. Select appropriate uses of SSRS Data Mining Extensions (DMX) queries.
71. Select appropriate uses of ActiveX Data Objects (Multidimensional) (ADOMD).
72. Select appropriate uses of SSIS Data Mining Query tasks.
73. Select appropriate uses of data mining viewer controls for Microsoft .NET Framework-based applications.
74. Select appropriate uses of full processing.
75. Select appropriate uses of structure processing.
76. Select appropriate uses of default processing.

The exam focused heavily on Integration Services, data model & data mining exhibits, with quite a few questions around transactions and security.

My next exam will probably be the PerformancePoint beta, or maybe Sharepoint... or ... Oracle?

Source: MCTS: SQL Server 2005 Business Intelligence

Thursday, August 23, 2007

Open Source Project Management Tools in C#

 

User Story.NET

This project is a tool for Extreme Programming projects in their User Story tracking.

Go To User Story.NET

Project Portal

Project Portal is a multi-lingual, multi-user web-based groupware suite for Programme & Project Management.

Go To Project Portal

SharpForge

SharpForge supports collaborative development and management of multiple software projects. Similar to SourceForge or CodePlex, but for your own team or organisation. The software is written in C# for .NET 2.0, uses Subversion for source control and is released under the New BSD License.

Home page: http://sharpforge.org/

Features:

  • Multi Portal
  • Multi Project
  • Subversion Administration
  • (planned) Work Item Tracking
  • (planned) Project Forums
  • (planned) Release Management
  • (planned) Subversion based content management
  • (planned) News Feed Aggregation

Requirements:

  • Windows 2000 / WinXP + SP1 or Windows 2003
  • IIS
  • .NET 2.0
  • SQL Server Express SP1
  • Apache 2.0.54
  • Subversion 1.3

Go To SharpForge

Source: Open Source Project Management Tools in C#

Wednesday, August 22, 2007

Cool Text: Logo and Graphics Generator

 

Cool Text is a free graphics generator for web pages and anywhere else you need an impressive logo without a lot of design work. Simply choose what kind of image you would like. Then fill out a form and you'll have your own image created on the fly.

Source: Cool Text: Logo and Graphics Generator

Open Source Rules Engine for .NET

 

NxBRE - The open-source rule engine for .NET

NxBRE is the open-source rule engine for the .NET platform and a lightweight Business Rules Engine that supports two different approaches:

  • The Inference Engine, which is a forward-chaining (data-driven) deduction engine that supports concepts like Facts, Queries and Implications, as well as Rule Priority, Mutual Exclusion and Precondition (as found in many commercial engines). It is designed in a way that encourages the separation of roles between the expert who designs the business rules and the programmer who binds them to the business objects.
  • The Flow Engine, which uses XML as a way to control process flow for an application in an external entity. It is basically a wrapper on C#, as it offers all its flow control commands (if/then/else, while, foreach), plus a context of business objects and results. It is a port of JxBRE to .NET's C#.

NxBRE is released under LGPL license in order to allow users to legally build commercial solutions that embed NxBRE.

Source: .Net Adventures

Tuesday, August 21, 2007

Microsoft Excel : CUBE Functions 3: Formula AutoComplete revisited

 

Formula Auto-Complete for CUBE function arguments
Formula AutoComplete for CUBE function arguments has some special behaviours relative to the other functions in Excel.   As a brief refresher, Formula AutoComplete is a feature that provides a list of values from which to choose as you write formulas … here is a blog post that describes this in detail.  In most Formula AutoComplete scenarios, Excel knows the list of values (formulas, named ranges, table names) that it should display because those values are part of the Excel application.  For example, when you start typing a function name, Excel can give you a list of all the other functions that start with the same character(s), as is shown in this screenshot.
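For context, the CUBE functions take connection, measure and member names as text arguments, which is exactly where AutoComplete earns its keep. A formula looks something like this (the connection and member names here are illustrative):

```
=CUBEVALUE("Adventure Works", "[Measures].[Internet Sales Amount]", "[Date].[Calendar Year].&[2004]")
```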

Microsoft Excel : CUBE Functions 3: Formula AutoComplete revisited

Monday, August 20, 2007

The Helper Table Workbench

 

Sometimes, when writing T-SQL code in functions or procedures, it is tempting to use iteration, or even worse, a cursor, when it isn't really necessary. Cursors and iterations are both renowned for slowing down Transact-SQL code; SQL Server just isn't designed for them. However, there is usually a way to do such operations in a set-based way. If you do so, your routines will run a lot faster, with speed at least doubling. There are a lot of tricks to turning a problem that seems to require an iterative approach into a set-based operation, and we wish we could claim we'd invented one of them. Probably the most useful technique involves that apparently useless entity, the 'helper' table. This workshop will concentrate on this technique, because it is probably the most widely used.

The most common helper table you'll see is a table with nothing but the numbers in a sequence from 1 upwards. These tables have a surprising number of uses. Once you've understood the principles behind helper tables, you'll think of many more. We'll be providing several examples where a helper table suddenly makes life easier. The objective is to show the principles so that you'll try out something similar the next time you have to tackle a tricky operation in T-SQL.

As always, you're encouraged to load the example script into Query Analyser or Management Studio, and experiment!

Our examples include:

  • Splitting strings into table rows, based on a specified delimiter
  • Encoding and decoding a string
  • Substituting values into a string
  • Extracting individual words from a string into a table
  • Extracting all the numbers in a string into a table
  • Removing all text between delimiters
  • Scrabble score
  • Moving averages
  • Getting the 'week beginning' date in a table
  • Calculating the number of working days between dates
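To give a flavour of the technique (my sketch, not the workbench's script): once a numbers helper table exists, the string-splitting example collapses into a single set-based query that finds every delimiter position at once.

```sql
-- One-off setup: a helper table holding the integers 1 to 8000.
CREATE TABLE Numbers (Number INT NOT NULL PRIMARY KEY);
INSERT INTO Numbers
SELECT TOP 8000 ROW_NUMBER() OVER (ORDER BY c1.object_id)
FROM sys.columns AS c1 CROSS JOIN sys.columns AS c2;

-- Set-based string split: each row of Numbers that lands on a delimiter
-- marks the start of one item; no loop, no cursor.
DECLARE @list VARCHAR(8000);
SET @list = 'alpha,beta,gamma';

SELECT SUBSTRING(@list, Number,
                 CHARINDEX(',', @list + ',', Number) - Number) AS Item
FROM Numbers
WHERE Number <= LEN(@list)
  AND SUBSTRING(',' + @list, Number, 1) = ',';
```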

Source: The Helper Table Workbench

SQL Server 2005 DDL Trigger Workbench

 

How about automatically tracking and logging all database changes, including changes to tables, views, routines, queues and so on? With SQL Server 2005 it isn't that hard, and we'll show how it is done. If you haven't got SQL Server 2005, then get SQL Server Express for free. It works on that! While we're about it, we'll show you how to track all additions, changes and deletions of Logins and Database Users, using a similar technique.
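A minimal version of the idea (a sketch, not the workbench's actual script) is a database-scoped DDL trigger that writes the EVENTDATA() XML into a log table:

```sql
-- Log table for DDL events in this database.
CREATE TABLE DDLEventLog (
    EventTime DATETIME NOT NULL DEFAULT GETDATE(),
    LoginName SYSNAME  NOT NULL DEFAULT SUSER_SNAME(),
    EventData XML      NOT NULL
);
GO
-- Fires on every database-level DDL statement (CREATE/ALTER/DROP of tables,
-- views, procedures, etc.) and records the full event XML.
CREATE TRIGGER trg_LogDDL
ON DATABASE
FOR DDL_DATABASE_LEVEL_EVENTS
AS
    INSERT INTO DDLEventLog (EventData)
    VALUES (EVENTDATA());
GO
```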

Source: SQL Server 2005 DDL Trigger Workbench

Robyn Page and Phil Factor

Some excellent articles on Reporting Services & SQL programming. 

Robyn Page is a consultant with Enformatica and USP Networks. She is also a well known actress, being most famous for her role as Katie Williams, barmaid in the Television Series Family Affairs.

Phil Factor (real name withheld to protect the guilty), aka Database Mole, has 20 years of experience with database-intensive applications. Despite having once been shouted at by a furious Bill Gates at an exhibition in the early 1980s, he has remained resolutely anonymous throughout his career.

Source: Robyn Page and Phil Factor

PDF from .NET, Reporting, BIRT, Telerik, Orcas, URL Data Services and Astoria! - Sam Gentile

Experiences from the front - adhoc reporting of the future. 

About this time, we both started to realize that what a lot of these reporting tools wanted was an "XML feed," if you will, particularly Actuate. It made much more sense to have them "point to" XML rather than tight-binding directly to database tables. Then we started to talk about how what was really needed was URI-addressable sources of XML-formatted data.

That's when we both cried "Astoria!" We finally get to use an Orcas thing! :-) So we downloaded Astoria, fired up Orcas and, pretty much without looking at any doc, were able to get a prototype working that used a REST-style interface over our SQL Server data, returned via an Entity Framework model, with URIs to point to it. I think it's fair to say that we were both quite impressed with Astoria's power and ease of programmability in this regard. It just seems so natural to me now that data and resources are surfaced through URIs and HTTP verbs.

Source: PDF from .NET, Reporting, BIRT, Telerik, Orcas, URL Data Services and Astoria! - Sam Gentile

Rapid Application Development: Referential Integrity - Data Modeling Mistake 1 of 10

Catch up - Lee is on number two. 

Referential Integrity - Data Modeling Mistake 1 of 10

In my mind data models are like the foundations of a house. Whether you use ORM or a more traditional modeling tool, they form the base of the entire rest of your project. Consequently, every decision you make (or don’t make) regarding your data model during the design phase(s) of your project will significantly affect the duration of your project and the maintainability and performance of your application.

Rapid Application Development: Referential Integrity - Data Modeling Mistake 1 of 10

Saturday, August 18, 2007

Lucian's weblog : Retrieve data from Wikipedia using C#

This could be useful, since Wikipedia is now the source of all information in the world, along with Google & Facebook! 

To get other pages, you can simply use the direct link. The link is composed (as you’ve maybe already noticed) from http://en.wikipedia.org/wiki/Special:Export/ followed by the name of the data you want to retrieve. So, if you want data about William Shakespeare, the direct link to the XML file will be http://en.wikipedia.org/wiki/Special:Export/William_Shakespeare.

Knowing all this, it’s now simple to write a program to deal with the set of data provided by Wikipedia. The program looks something like this (and I’ll only give you the important part of the code, I’m sure you know where to put it):

// Assumes: using System; using System.IO; using System.Net;
// using System.Xml; using System.Xml.XPath; using System.Windows.Forms;
private void button1_Click(object sender, EventArgs e)
{
    // Request the XML export for the William Shakespeare article.
    HttpWebRequest webRequest = (HttpWebRequest)WebRequest.Create(
        "http://en.wikipedia.org/wiki/Special:Export/William_Shakespeare");
    webRequest.Credentials = CredentialCache.DefaultCredentials;
    webRequest.Accept = "text/xml";

    try
    {
        HttpWebResponse webResponse = (HttpWebResponse)webRequest.GetResponse();
        Stream responseStream = webResponse.GetResponseStream();
        XmlReader reader = new XmlTextReader(responseStream);

        // The MediaWiki export XML namespace.
        string NS = "http://www.mediawiki.org/xml/export-0.3/";

        XPathDocument doc = new XPathDocument(reader);
        reader.Close();
        webResponse.Close();

        // Walk every <text> element and append its contents to the textbox.
        XPathNavigator myXPathNavigator = doc.CreateNavigator();
        XPathNodeIterator nodesText = myXPathNavigator.SelectDescendants("text", NS, false);
        while (nodesText.MoveNext())
            textBox1.Text += nodesText.Current.InnerXml + " ";
    }
    catch (Exception ex)
    {
        textBox1.Text = ex.ToString();
    }
}

Lucian's weblog : Retrieve data from Wikipedia using C#

More on What is Astoria

On to "What is Astoria?"

Astoria is the cloudiest city in North America. Astoria is also the codename for an incubation project started some months ago attempting to answer the following questions: if you could provide a dead-simple way of programming against a relational data store that resides on the internet, what should the programming model look like? Could it be simpler than SOAP-based data access programming?

Microsoft Codename "Astoria": Data Services for the Web - Alex Barnett blog

Pablo Castro's blog - death of a database

Pablo offers a great distinction of a data service vs. a database. 

"...in a service you'll want the schema to be optimized for its target use, so semantics tend to drive it. In a database, schema will be about data organization and performance."

A service is a different beast in this sense as well. In a service you just put the data "up there". The service will choose the appropriate physical organization for the data, regardless of the visible service schema. (for example, in the Astoria experimental online service you describe your data as entities and associations and the system figures out a logical/physical schema to support it, along with a mapping to translate between them). The system supporting the service may or may not use a relational database, or even a database at all (there are many large-scale storage systems that use other models instead of traditional relational to avoid the impact of global metadata and the complexity of partitioning highly structured schemas for scale-out).

Pablo Castro's blog

Project Astoria Team Blog : Transparency in the design process

Excellent article around the design processes one team at Microsoft uses. 

How transparent is transparent? I want to be completely clear about the scope of the information we are sharing. One of the things we need to learn both from the Microsoft side and from the community side is whether the model works within a practical set of restrictions. We would post as much of our discussions as it is practically possible. However, we have to make sure we don’t compromise the interests of Microsoft as a company. There are certain things that can range from ideas to specific implementation details that we could consider trade secrets, high-value Microsoft intellectual property or something along those lines. It *will* happen that in some cases we will not discuss a topic publicly, either for a certain term (e.g. until a proper IP protection mechanism is in place) or until we ship or ever. This is nothing new, but I haven’t seen folks from large companies discuss this explicitly before, so I wanted to make sure it is clear here.

Project Astoria Team Blog : Transparency in the design process

Friday, August 17, 2007

Passed 70-445 BI Exam

When the beta exams for Business Intelligence showed up in January, I immediately signed up, wrote both of them, and found out after 4 months that I failed both. :)  Considering I didn't have any time to study (I was on a heavy client engagement), didn't have a lot of knowledge around the data mining aspects of Analysis Services, whipped through the questions at light speed, and ate a big lunch beforehand, I was more surprised that I didn't fall asleep than that I failed.  Plus, since they were beta, I'm sure some of my answers were correct and it was the questions that were wrong!

Today, after putting a bit of effort in, I passed 70-445, the implementation & maintenance exam, with flying colors.  The betas were brutally long - this one didn't seem half as bad, though as usual the test computer was slow and there were a bunch of outside distractions & banging ductwork in the room.

My studying strategy:

  1. Get the outline from MS's site.
  2. Export it to Excel and select the items I'm not familiar with.
  3. Bring them into OneNote.
  4. Create tabs for each item to expand on.
  5. Cross them out as I get the research.
  6. Google 70-445, 70-445 filetype:ppt (pdf, doc, etc.) to get things like course outlines & demo exams.
  7. Search Blogsearch & Technorati.
  8. Search for the specific items I'm not familiar with using the same wording as the MS outline.
  9. Read the books.

My pretesting strategy:

  1. Always write the exam on a Friday at 1:15pm (a happy day!)
  2. Have a light lunch.
  3. Have a coffee and a quick jog around the block.
  4. Show up a bit early.

My testing strategy: 

  1. Don't spend more than 1-2 minutes per question; mark it for later.
  2. When in doubt, pick the answer that matches the text of the question, or is the opposite of the question, or C.
  3. Watch for the yahoo question writers who stick in those trick questions involving percentage calculations or NOT logic.

Here's one statistic that blew me away.  I didn't realize that BI was such a niche.  There have been only 71 MCITP + BI certifications since June.  Perhaps because the exams are only a couple of months old?  Next will be the BI Requirements & Design exam 70-446.  Sounds like case studies to me....

The Microsoft Certified Professionals (MCP) program was established in 1992. To date, more than 2 million people have achieved Microsoft Certification worldwide.

Based on customer feedback, we have updated this list to include both the credential and the number of professionals on each release or certification. Because the MCP program is constantly growing, you may not see your most recent certification listed. We will add accurate data as it becomes publicly available. These numbers are current as of June 27, 2007.

Source: Number of Microsoft Certified Professionals Worldwide

Shailan - Data for the Masses : Online Training for Performance Point Server

 

Online training courses available for PerformancePoint Server

Comprehensive online training can be accessed through the links below. Microsoft Office Live Meeting must be installed before viewing any of the training sessions. If Live Meeting is not installed at the time you register for a training event, the system will direct you to a location where the application can be downloaded.

To gain access to the training sessions, click the following links. After you navigate to a site, you must register for the event; the system will send you an e-mail message with a link to the training session, which you then view in Microsoft Windows Media Player.

Monitoring

Analytics

Source: Shailan - Data for the Masses : Online Training for Performance Point Server

Tuesday, August 14, 2007

SQL Server Best Practices

More on the site... 

Technical White Papers

Deep-level technical papers on specific SQL Server topics that were tested and validated by SQL Development.


SQL Server Best Practices Toolbox

Top 10 Lists

Summary lists (usually consisting of 10 items) of recommendations, best practices and common issues for specific customer scenarios by the SQL Server Customer Advisory Team.


Best Practices in SQL Server Books Online

Source: SQL Server Best Practices

Microsoft SharePoint Products and Technologies Team Blog : Microsoft SQL Server 2005 SP2 Reporting Services integration with WSS 3.0 and MOSS 2007

 

 

SQL Server 2005 Service Pack 2 (SP2), which will be released any minute now :-), enables deep integration between Reporting Services and SharePoint technologies (Windows SharePoint Services 3.0 and Office SharePoint Server 2007). This integration enables an end-user to view and manage reports completely from within a SharePoint environment. The following is an excerpt from the upcoming 2007 Microsoft Office System Business Intelligence Integration with SQL Server 2005 whitepaper. [Update March 17: Added the hyperlink for the whitepaper, which is 1.5 MB in size and in Word 2007 .DOCX format.]

Source: Microsoft SharePoint Products and Technologies Team Blog : Microsoft SQL Server 2005 SP2 Reporting Services integration with WSS 3.0 and MOSS 2007

Monday, August 13, 2007

SQL Server 2008 Improvement Pillars

I'm in the middle of a webchat on SQL 2008 and this chart looks useful for tracking its progress...

The chart (linked) below depicts significant improvements coming online with each CTP. Below, you will see the 10 new improvements in the July CTP. Click on any improvement in the chart to explore and learn more about it.  Want to find out more about the improvement groups?  Each group will open to an informative video on what each means.

Link to SQL Server 2008 Improvement Pillars

Sunday, August 12, 2007

OneNote Web Exporter - Home

 

Project Description
Plug-in to export your OneNote 2007 notebooks as an interactive web site.

Source: OneNote Web Exporter - Home

Coming out of the cloud

 

Could this be MS's answer to Amazon EC2?

Microsoft SoftGrid® Application Virtualization

Microsoft SoftGrid® Application Virtualization is the only virtualization solution on the market to deliver applications that are never installed, yet securely follow users anywhere, on demand. It dramatically improves IT efficiencies, enables much greater business agility and a superior end-user desktop experience.

Source: Microsoft SoftGrid® Application Virtualization

Run IT on a Virtual Hard Disk

How soon before MS releases its own Amazon-style Cloud initiative with VPCs?

 

Using the power of virtualization, you can now quickly evaluate Microsoft and partner solutions through a series of pre-configured Virtual Hard Disks (VHDs). You can download the VHDs and evaluate them for free in your own environment without the need for dedicated servers or complex installations. Start now by selecting a lab from the VHD catalog below.

Source: Run IT on a Virtual Hard Disk

Sharing OneNote

The example is from a BI course; however, it can apply to any OneNote sharing system.

1. Create a folder on your C drive called Ranger.

2. Create folders under C:\Ranger for Notes, Labs (for the lab instructions), and Sessions.

3. Under the Notes folder, create a folder with your name.

4. Create a OneNote 2007 notebook in that folder.

5. In the OneNote 2007 notebook, create a section per day (Day 1, Day 2 ... Day 20).

6. Create a page under each day for each session.

7. Then "print to OneNote" or copy the Presentation for each session in the session page.

8. Write your notes on each page along with the presentation. (THERE IS NO WAY YOU WILL RETAIN ALL YOU ARE TAUGHT WITHOUT THESE STUDY AIDS).

9. Under the Sessions folder create a folder per session and copy in the content from the Ranger SharePoint site for that session. (This gives you a logical place to put presentations, links, whitepapers and other content you find on the web).

10. Use FolderShare (man, I love this tool), or Groove, and setup a sync with your study group so you are all in sync and you can leverage each other's notes and content.

Source: ...more notes from the field

Useful links from Dan's blog

 

SQL ISV blog: http://blogs.msdn.com/sqlisv

SQL CAT Blog: http://blogs.msdn.com/sqlcat

Best Practices Site: http://www.microsoft.com/technet/prodtechnol/sql/bestpractice/default.mspx

SQL Server AlwaysOn Partner program: http://www.microsoft.com/sql/alwayson/default.mspx

SQL Server Urban Legends: http://blogs.msdn.com/psssql/archive/2007/02/21/sql-server-urban-legends-discussed.aspx

SQL Server 2000 I/O Basics whitepaper: http://www.microsoft.com/technet/prodtechnol/sql/2000/maintain/sqlIObasics.mspx

Disk Subsystem Performance Analysis for Windows: http://www.microsoft.com/whdc/device/storage/subsys_perf.mspx

Microsoft Windows Clustering: Storage Area Networks: http://www.microsoft.com/windowsserver2003/techinfo/overview/san.mspx

Windows Server System Storage Home: http://www.microsoft.com/windowsserversystem/storage/default.mspx

Microsoft Storage Technologies – Multipath I/O: http://www.microsoft.com/windowsserversystem/storage/technologies/mpio/default.mspx

Windows 2003 Storport Driver: http://msdn.microsoft.com/library/default.asp?url=/library/en-us/storage/hh/storage/portdg_88563b10-e292-48f3-92de-65bf90530455.xml.asp

Virtual Device Interface Specification: http://www.microsoft.com/downloads/details.aspx?FamilyID=416f8a51-65a3-4e8e-a4c8-adfe15e850fc&DisplayLang=en

SQL Server Consolidation on the 64-Bit Platform: http://www.microsoft.com/technet/prodtechnol/sql/2000/deploy/64bitconsolidation.mspx

SQL Server Consolidation on the 32-Bit Platform using a Clustered Environment: http://www.microsoft.com/technet/prodtechnol/sql/2000/deploy/32bitconsolidation.mspx

INF: Support for Network Database Files: http://support.microsoft.com/default.aspx?scid=kb;en-us;304261

824190 Troubleshooting Storage Area Network (SAN) Issues: http://support.microsoft.com/?id=824190

304415 Support for Multiple Clusters Attached to the Same SAN Device: http://support.microsoft.com/?id=304415

280297 How to Configure Volume Mount Points on a Clustered Server: http://support.microsoft.com/?id=280297

819546 INF: SQL Server support for mounted volumes: http://support.microsoft.com/?id=819546

304736 How to Extend the Partition of a Cluster Shared Disk: http://support.microsoft.com/?id=304736

325590 How to Use Diskpart.exe to Extend a Data Volume: http://support.microsoft.com/?id=325590

Updated Books Online for SQL Server 2005: http://www.microsoft.com/technet/prodtechnol/sql/2005/downloads/books.mspx

Shared Scalable Database: http://support.microsoft.com/?kbid=910378

Sunil gave us all we need to know and more on the TEMPDB. Check out the whitepaper he references: http://www.microsoft.com/technet/prodtechnol/sql/2005/workingwithtempdb.mspx

In addition, do not forget to review Kalen Delaney's book on the Database Storage Engine, as she goes into detail on TempDB there.

Source: ...more notes from the field

Acropolis Parts, Views, and xaml files. - MSDN Forums

 

Web Parts for Windows? Here's Acropolis.

Acropolis provides a mechanism that lets you cleanly separate your business logic from any associated UI. The Part component contains the business logic, while the View contains the user interface for the Part. Both are components in the technical sense: each exists as a separate piece of functionality, and either can be replaced as long as it implements the right interfaces.

But the Part component is somewhat different in that it is the main component that gets composed into the application. You can have Parts with zero, one, or more Views, but you can't have a View without an underlying Part. If you want some UI but don't want to split the business logic into a separate component, you would build a standard control instead.

This separation between business logic and UI exists all the way up to the main application itself. You can think of the entire application as a hierarchy of Parts and Views. The Application.xaml file defines the root level (i.e. application level) business logic for the application. The Window1.xaml file defines the UI for the application (i.e. the application's main window or Shell). I have seen people rename Window1.xaml to Shell.xaml but the Application.xaml file usually keeps that name. You can't really delete them because you need something to contain the other Parts and Views in the application.
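The Part/View split described above can be sketched as a generic pattern. To be clear, this is not the actual Acropolis API: the class names and wiring below are invented for illustration only; Acropolis defines its own base classes and XAML-based composition.

```python
# Generic sketch of the Part/View separation described above.
# These class names are illustrative, NOT real Acropolis types:
# business logic lives in a Part, presentation in replaceable
# Views that observe it.

class CounterPart:
    """The 'Part': business logic only, no UI knowledge."""
    def __init__(self):
        self.count = 0
        self._views = []          # zero, one, or more attached Views

    def attach(self, view):
        self._views.append(view)

    def increment(self):
        self.count += 1
        for v in self._views:     # notify every attached View
            v.render(self.count)

class TextView:
    """A 'View': presentation only; useless without an underlying Part."""
    def __init__(self):
        self.last_output = None

    def render(self, count):
        self.last_output = f"Count: {count}"

part = CounterPart()
view = TextView()
part.attach(view)
part.increment()
print(view.last_output)
```

Because the Part never touches presentation code, swapping `TextView` for a richer View (or attaching several at once) requires no change to the business logic, which is the point of the separation.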

Source: Acropolis Parts, Views, and xaml files. - MSDN Forums

Wednesday, August 08, 2007

Why every program should be like OneNote

Two things every program should have: instant-on/instant-off automatic load/save, and unlimited undo/history.

A useful feature for complex sites would be the ability to view the path you took through a web site (a history) and to recreate it the next time you're there (a journal).

Solution: Archy never loses your work. This shouldn't be a groundbreaking innovation in computer design, but it is. You never have to save because it's done for you automatically. Your data is stored in such a way that if your computer crashes, your information will still be there the next time you start Archy up.

You don't have to worry if you make a mistake. The UNDO command can reverse your steps all the way back to the first thing you ever did on Archy. Quitting Archy has no effect on UNDO's elephantine memory.
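The "never lose your work" behavior Raskin describes comes down to two mechanisms: every action is persisted the moment it happens (so a crash loses nothing), and the persisted history doubles as an unbounded undo stack that survives quitting. A minimal sketch of that idea, with an invented file format, is:

```python
import json
import os
import tempfile

class Journal:
    """Append-only action log: every edit is saved immediately, so a
    crash never loses work, and undo can walk back to the first action
    even after a restart. The JSON-lines format here is illustrative."""
    def __init__(self, path):
        self.path = path
        self.actions = []
        if os.path.exists(path):              # recovery: replay the log
            with open(path) as f:
                self.actions = [json.loads(line) for line in f]

    def record(self, action):
        self.actions.append(action)
        with open(self.path, "a") as f:       # persist before returning
            f.write(json.dumps(action) + "\n")

    def undo(self):
        if not self.actions:
            return None
        undone = self.actions.pop()
        with open(self.path, "w") as f:       # rewrite log without it
            f.writelines(json.dumps(a) + "\n" for a in self.actions)
        return undone

path = os.path.join(tempfile.mkdtemp(), "journal.log")
j = Journal(path)
j.record({"op": "insert", "text": "hello"})
j.record({"op": "insert", "text": "world"})

j2 = Journal(path)        # simulate a restart: history survives quitting
assert len(j2.actions) == 2
assert j2.undo()["text"] == "world"
```

There is never a "save" step for the user to forget, and because undo operates on the same persisted log, quitting has no effect on its memory.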

Source: Core Principles - Raskin Center

Tuesday, August 07, 2007

Conceptual Design Ideas

Interface Concepts provides a good summary of the conceptual design process:

The process of conceptual design involves a set of steps for translating requirements into a user interface design. The process begins by getting at the core of an application--the central concept--and proceeds by organizing the functionality from the users' point of view. Along the way, a deeper understanding of users and their requirements is developed. The result is an outline or model of the user interface that may be further developed during the detailed user interface design phase.

The conceptual design process involves the following steps:

  1. Define a central concept.
    A concise statement of what the application is and what it is not. It clearly defines the boundaries of the application and characterizes the overall users' view of the application.
  2. Describe user roles and their requirements.
    A list of who the target users are, what their roles are in their use of the application, and what is important to them--such as getting work done quickly, being in charge, and feeling successful.
  3. Define and prioritize measurable objectives and constraints.
    Objectives for the user interface are the designers' intentions such as:
    • to reduce repetitive tasks,
    • to have users feel in control, and
    • to provide satisfactory feedback on results.

    Operational definitions specify how the design will be measured against the objectives, for example, in usability testing. Constraints define the design space within which the objectives may be achieved, such as display resolution, response time, and availability of a pointing device.
  4. Design the user's object model.
    A table of all objects the user needs along with their attributes, actions, contents, and relationships among objects.
  5. Design the user's task model.
    A list of all tasks the user needs to perform with procedures on how to perform each task using the application.
  6. Synthesize a user interface model.
    The user interface model organizes the functionality according to the object and task models. This is a rough outline of the user interface that guides the detailed design phase.
  7. Evaluate results against the objectives.
    Various evaluation methods, such as heuristic evaluation and usability testing, are selected to measure how well the objectives have been achieved.
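The artifacts from steps 4 and 5 are easy to capture in a lightweight structure, which also lets you check them against each other before detailed design begins. A minimal sketch, where the example objects, attributes, and task are invented purely for illustration:

```python
# Sketch of a user's object model (step 4): each entry lists an
# object's attributes, actions, and related objects. "Document" and
# "Folder" are invented examples, not from the source article.
object_model = {
    "Document": {
        "attributes": ["title", "author", "modified"],
        "actions":    ["open", "edit", "print", "delete"],
        "related_to": ["Folder"],
    },
    "Folder": {
        "attributes": ["name", "item_count"],
        "actions":    ["open", "rename"],
        "related_to": ["Document"],
    },
}

# Task model (step 5): each task maps to the (object, action) pairs
# it requires, stated from the user's point of view.
task_model = {
    "file a report": [("Folder", "open"), ("Document", "edit")],
}

# Simple consistency check: every action a task relies on must exist
# in the object model, catching mismatches between steps 4 and 5 early.
for task, steps in task_model.items():
    for obj, action in steps:
        assert action in object_model[obj]["actions"], (task, obj, action)
print("task model is consistent with object model")
```

The check is trivial, but it illustrates why the process derives the task model from the object model: any task that needs an action the objects don't offer surfaces immediately.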

Source: Interface Concepts

Thursday, August 02, 2007

Free SharePoint Web Parts (3rd Party) - The Boiler Room - Mark Kruger, Microsoft SharePoint MVP

 

Free SharePoint Web Parts (3rd Party)

Source: Free SharePoint Web Parts (3rd Party) - The Boiler Room - Mark Kruger, Microsoft SharePoint MVP

Great set of SharePoint Web Parts - Eli Robillard's World of Blog.

 

Great set of SharePoint Web Parts

UGS contains one of the nicest web part collections I've seen:

Source: Great set of SharePoint Web Parts - Eli Robillard's World of Blog.