Thursday, August 29, 2013

Calling Server Side Function From JavaScript in ASP.Net


In this article we will see how to call a server-side function from JavaScript in ASP.Net. This is a common requirement in many projects, and you can meet it by using AJAX to communicate with the server-side method. The client-side use of AJAX is demonstrated in the code below.
I will now discuss the server-side function and how to call it from JavaScript. 
Server Code (.cs file)
In the following code we create a method named "Name" whose return type is string.
using System;
using System.Collections.Generic;
using System.Linq;
using System.Web;
using System.Web.UI;
using System.Web.UI.WebControls;
using System.Web.Services;
 
public partial class Default2 : System.Web.UI.Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
    }

    [WebMethod]
    public static string Name()
    {
        string Name = "Hello Rohatash Kumar";
        return Name;
    }
}

The preceding server-side code has two important pieces: the [WebMethod] attribute and the static modifier.
[WebMethod]
To make use of the [WebMethod] attribute you need the "System.Web.Services" namespace, as shown in the preceding server-side code. Attach the [WebMethod] attribute to a server-side method to make the method callable from remote Web clients.

Static (Declaration)

A static method is callable on a class even when no instance of the class has been created. If any instances of the class are created, they cannot be used to access the static member. Only one copy of each static field and event exists. A static method is accessed directly by the class name followed by the "." (dot) operator and the method name. That is why I marked the method as static: a page method call creates no instance of the Page or any of its controls, so the method cannot interact with the instance properties and methods of your Page class.
Client Code( .aspx code)
The client side code has the following control and properties on the page.
ScriptManager Control
The EnablePageMethods property of the ScriptManager control indicates whether public static page methods in an ASP.NET page can be called from client script. The EnablePageMethods attribute must be set to true on the ScriptManager tag.
Button Control
Drag and drop a Button control onto the form to show the message when you click on the Button.
OnClientClick Property
The "OnClientClick" property associates the Button with a JavaScript function that runs on the client when the Button is clicked.
<%@ Page Language="C#" AutoEventWireup="true" CodeFile="Default2.aspx.cs" Inherits="Default2" %>

<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head runat="server">
    <title></title>
    <script type='text/javascript'>
        function GetName() {

            PageMethods.Name(Success, Failure);
        }
        function Success(result) {
            alert(result);
        }
        function Failure(error) {
            alert(error);
        }
    </script>
</head>
<body>
    <form id="form1" runat="server">
    <asp:ScriptManager ID='ScriptManager1' runat='server' EnablePageMethods='true' />
    <div>  
        <asp:Button ID="Button1" runat="server" Text="Get Name" OnClientClick='GetName();return false;' />
    </div>
    </form>
</body>
</html>

The code above shows how to call the server-side function named "Name". 
If there are no errors, "Success" will show a pop-up window with our server-side message text.
function Success(result) {
            alert(result);
        }
 
Otherwise, "Failure" will show a pop-up with the exception message.
function Failure(error) {
            alert(error);
        }
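If you prefer not to rely on the PageMethods proxy, the same page method can also be called with a hand-rolled AJAX request. The sketch below (written in TypeScript; the page name "Default2.aspx" and method name "Name" are taken from the code above) assumes the standard ASP.NET AJAX convention that a page method accepts a JSON POST and wraps its return value in a "d" property:

```typescript
// Extract the page-method return value; ASP.NET AJAX wraps it in a "d" property.
function parsePageMethodResponse(responseText: string): string {
    return JSON.parse(responseText).d;
}

// A sketch of calling the "Name" page method without the PageMethods proxy.
function callName(onSuccess: (result: string) => void,
                  onFailure: (error: string) => void): void {
    const xhr = new XMLHttpRequest();
    xhr.open("POST", "Default2.aspx/Name", true);
    xhr.setRequestHeader("Content-Type", "application/json; charset=utf-8");
    xhr.onload = () => {
        if (xhr.status === 200) {
            onSuccess(parsePageMethodResponse(xhr.responseText));
        } else {
            onFailure(xhr.statusText);
        }
    };
    xhr.send("{}");
}
```

Calling callName(Success, Failure) from GetName would then behave much like the PageMethods call shown earlier.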

Now press F5 to run the application.
Now click on the Button control to see the server-side code message.

InfoQ eMag: The Best of Mobile Development



Download the Mobile Development eMag to learn about strategies for developing native, HTML5, and hybrid mobile apps, the state of cross-platform mobile tools, and the future of mobile development.

Free download

Download this book FREE (PDF)

Contents of the Mobile Development eMag include:

  • Building Hybrid Mobile Apps with ASP.NET MVC Daniel Jebaraj demonstrates creating hybrid mobile apps with Android and ASP.NET MVC.
  • Bijan Vaez on HTML5 Learn why the EventMobi conference mobile app was built with HTML5 instead of native technologies, the challenges and advantages, and much more.
  • InfoQ Research Results: What are the Most Important and Mature Cross-Platform Mobile Tools? InfoQ examined the importance and adoption level of a range of cross platform mobile tools that aim to help developers deliver applications on a variety of mobile platforms. The results have been made available in this eMag.  
  • Mobile, HTML5 and the Cross-Platform Promise Maximiliano Firtman discusses the status of HTML5 and its browser support, introducing cross-platform app creation for the store, including hybrid apps with PhoneGap.

About InfoQ eMags

InfoQ eMags are professionally designed, downloadable collections of popular InfoQ content - articles, interviews, presentations, and research - covering the latest software development technologies, trends, and topics.

InfoQ eMag: The Best of NoSQL



The InfoQ NoSQL eMag brings together a selection of popular NoSQL articles recently published on InfoQ.com.  Hand selected and curated by the InfoQ editorial team, this professionally designed eMag will help you get up to speed on some of the key underlying concepts and trends driving NoSQL adoption.

Free download

Download this book FREE (PDF)

Contents of this eMag include:

  • The State of NoSQL According to Dr. Stefan Edlich, NoSQL is here to stay. In this article he reviews the current state of NoSQL adoption,  the increasing demand for NoSQL development skills,  and recent startups that have received funding, along with a number of other key indicators. He looks at what the leaders in the NoSQL space are doing - as well as the newcomers - and concludes with a number of observations:
    • Eric Brewer’s own commentary on the CAP Theorem holds that the "2 of 3" formulation is misleading; the better statement is that one cannot have full availability and consistency in the presence of partitions
    • The "SQL community" is reacting to NoSQL,  with a number of "NewSQL" databases emerging in the market
    • Prematurely switching to a new technology can be costly,  but early adopters often benefit in the future
  • Introduction to MongoDB for Java, PHP and Python Developers Rick Hightower makes an argument for NoSQL databases, explaining why they matter and why they are useful. He continues with an introduction to MongoDB, explaining its shortcomings, tradeoffs and main concepts, and comparing basic operations with corresponding SQL statements. He also enters deeper waters addressing replica sets and sharding. The last section of the article is dedicated to practical guidance on setting up and starting to use MongoDB for Python, PHP and Java developers.
  • CAP Twelve Years Later: How the “Rules” Have Changed Dr. Eric Brewer, the author of CAP Theorem, attempts to clarify some of the misunderstandings surrounding the theorem, especially the idea that one has to choose 2 of 3 CAP properties. A more correct restatement of the theorem is that one cannot have 100% consistency and availability in the presence of partitions. Brewer also offers advice on handling partitions in order to achieve both consistency and availability.
  • NoSQL: Past, Present, Future In this session held at QCon San Francisco 2012, Prof. Dr. Eric Brewer takes a look at the history of NoSQL, tracing its origins to the ’60s with the first pointer-based database. He notes how developments in data storage and management took two different paths, one following the ACID prerequisites and the other building on BASE. Brewer also takes note of the NoSQL developments in the last decade and the role played by the CAP Theorem he authored.

Free E-Book: TypeScript for C# Programmers




Due to the popularity of the open web, JavaScript is becoming an essential language, and since 2009 it has been running on servers too thanks to NodeJS. The problem is that due to JavaScript's dynamic type system, it is hard to create great tooling around the language, such as sensible auto-completion, refactoring support, type-checking and modularisation.
TypeScript is an open source language from Microsoft that solves this problem by introducing an optional type system and class-based object-orientation, which make great tooling for large applications possible.
TypeScript lets you write JavaScript that is robust enough for the enterprise and that can run in any browser, on any host and on any operating system.
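As a small taste of what that looks like (my own sketch, not an excerpt from the book), here is ordinary JavaScript-style code with TypeScript's optional type annotations and a class added:

```typescript
// A minimal sketch of TypeScript's optional types and class-based OO.
class Greeter {
    // The access modifier on the constructor parameter declares the field.
    constructor(private name: string) {}

    // Parameter and return types are optional annotations; tooling uses them
    // for auto-completion, refactoring support and type-checking.
    greet(greeting: string = "Hello"): string {
        return greeting + ", " + this.name + "!";
    }
}

const greeter = new Greeter("TypeScript");
const message = greeter.greet(); // "Hello, TypeScript!"
```

Strip the annotations away and what remains is plain JavaScript, which is what the compiler emits.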

Free download

Download this book FREE (PDF)

Table of Contents

  • Compiling or Transpiling

  • Language Features

    • TypeScript Files
    • Types
    • Modules, Classes and Interfaces
    • Functions
    • Enumerations
    • Generics
  • Structural Typing

  • Access Modifiers

  • Memory Management

    • Releasing Resources
  • Exceptions

  • Arrays

  • Dates

    • Now
    • Date Methods
  • Events

    • Mouse Events
    • Keyboard Events
    • Object Events
    • Form Events
    • Custom Events
    • Running Order
  • Framework

  • Creating Definitions

    • Dynamic Declarations
    • Type Declarations
  • Useful Tricks

    • Obtaining Runtime Types
    • Extending Native Objects

Improved Debugging with Visual Studio 2013




Visual Studio 2013 includes several diagnostics features which enable developers to effectively debug their applications. It provides support for analyzing .NET memory dumps, enabling you to investigate memory issues on your servers. Moreover, you will be able to debug async operations for C++, JavaScript and .NET languages. According to official sources, this feature requires installation of Visual Studio 2013 on Windows 8.1 and works with Windows Store, desktop/server and ASP.NET web apps.
Visual Studio 2013 provides support for Just My Code for C++, which automatically hides code that is not classified as written by the original developer. However, the external code can be displayed by right clicking on the Call Stack window. It also provides support for CPU debugging on Windows 7 and an improved Profile Guided Optimization (PGO) process through the CPU Sampling profiler.

The IDE also adds the ability to debug JavaScript and C++ interop code, the long-requested 64-bit Edit and Continue for .NET, and viewing of function return values in the Autos window. It also includes support to filter, sort and categorize code in the Static Code Analysis window.
There is something to cheer about for C++ based Windows Store app developers, as they can take advantage of improved exception reporting in addition to advanced customization options for native visualization files. Moreover, developers will be able to attach the debugger to running JavaScript code without having to enable the script debugging tool. Daniel Moth, a Program Manager on Visual Studio, has provided references to //build/ sessions related to the diagnostic tools.

In addition to the above debugging enhancements, Visual Studio 2013 also includes a performance and diagnostics hub that ships with tools for measuring energy consumption and XAML and HTML UI responsiveness, along with JavaScript memory analysis, CPU sampling, JavaScript function timing and a performance wizard tool.

Wednesday, August 28, 2013

Free E-Book: Defensive Database Programming



Defensive Database Programming by Alex Kuznetsov

Editorial Review

Resilient T-SQL code is code that is designed to last, and to be safely reused by others. The goal of defensive database programming, the goal of this book, is to help you to produce resilient T-SQL code that robustly and gracefully handles cases of unintended use, and is resilient to common changes to the database environment.
Too often as developers, we stop work as soon as our code passes a few basic tests to confirm that it produces the 'right result' in a given use case. We do not stop to consider the other possible ways in which the code might be used in the future, or how our code will respond to common changes to the database environment, such as a change in the database language setting, or a change to the nullability of a table column, and so on.
In the short-term, this approach is attractive; we get things done faster. However, if our code is designed to be used for more than just a few months, then it is very likely that such changes can and will occur, and the inevitable result is broken code or, even worse, code that silently starts to behave differently, or produce different results. When this happens, the integrity of our data is threatened, as is the validity of the reports on which critical business decisions are often based. At this point, months or years later, and long after the original developer has left, begins the painstaking process of troubleshooting and fixing the problem.
Would it not be easier to prevent all this troubleshooting from happening? Would it not be better to spend a little more time and effort during original development, to save considerably more time on troubleshooting, bug fixing, retesting, and redeploying?
This is what defensive programming is all about: we learn what can go wrong with our code, and we proactively apply this knowledge during development. This book is filled with practical, realistic examples of the sorts of problems that beset database programs, including:
  • Changes in database objects, such as tables, constraints, columns, and stored procedures
  • Changes to concurrency and isolation levels
  • Upgrades to new versions of SQL Server
  • Changes in requirements
  • Code reuse
  • Problems causing loss of data integrity
  • Problems with error handling in T-SQL
In each case, it demonstrates approaches that will help you understand and enforce (or eliminate) the assumptions on which your solution is based, and to improve its robustness. Ultimately, the book teaches you how to think and develop defensively, and how to proactively identify and eliminate potential vulnerabilities in T-SQL code.
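As one concrete illustration of the kind of fragility involved (my own sketch, not an excerpt from the book), consider a date literal whose meaning silently changes with the session's language setting:

```sql
-- '02/07/2013' is ambiguous: its interpretation depends on the session's
-- language/DATEFORMAT setting, so the same code returns different dates.
SET LANGUAGE us_english;
SELECT CAST('02/07/2013' AS DATETIME);  -- February 7, 2013

SET LANGUAGE British;
SELECT CAST('02/07/2013' AS DATETIME);  -- July 2, 2013

-- A defensive alternative: the ISO 8601 format is interpreted the same way
-- regardless of the language setting.
SELECT CAST('2013-02-07T00:00:00' AS DATETIME);
```

Code written with the second form keeps producing the same result even after the environment changes underneath it, which is exactly the resilience the book is after.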

About the author

Alex Kuznetsov has been working with databases for more than a decade. Currently he leads database development and optimizes database performance for a trading firm.

Book Details

Paperback: 200 pages
Publisher: Red Gate Books

Buy from Amazon.com
Buy from Amazon.co.uk
Free PDF download

How to hide SQL Server user databases in SQL Server Management Studio




Problem

I have a SQL Server instance that has hundreds of databases.  Navigating the database tree in SSMS is a pain and I was wondering if there was a way to limit the list of databases that I see in SSMS?

Solution

SQL Server consolidation is becoming more popular these days to reduce costs, and therefore more and more databases are being put on one instance. It is very common to host multiple databases from multiple applications and departments on a consolidated instance, and sometimes application owners want to hide their databases from other users of the instance. They do not want to make their databases visible to others. This tip will give you an understanding of how databases can be hidden.

Setup

Suppose there are two databases A and B from two different applications and they are hosted on the same SQL Server instance. The users of database A are not allowed to see database B and vice versa. Here we will create two different logins user_A and user_B and give them appropriate rights to their own databases.

CREATE DATABASE A
GO
CREATE DATABASE B
GO
CREATE LOGIN user_A with password='U$er_A@1234'
Go
CREATE LOGIN user_B with password='U$er_B@1234'
Go
USE A
GO
CREATE USER user_A for login user_A;
GO
EXEC sp_addrolemember 'db_owner', 'user_A'
GO
USE B
GO
CREATE USER user_B for login user_B
GO
EXEC sp_addrolemember 'db_owner', 'user_B'

NOTE: Do not make changes in production without proper testing in lower life-cycle environments.

Hiding all user databases for all logins

Suppose you want to hide all databases for all logins. Generally we hide our databases for security purposes. We can run the below statements to hide all databases for all logins. The databases will then only be visible to sysadmin logins or owners of the database.

USE MASTER
GO
DENY VIEW ANY DATABASE TO PUBLIC
GO

Once you run the above statement, you will not be able to see any databases in SQL Server Management Studio unless you are a sysadmin or your login is the owner of a database.
As you can see in the below screenshot, I have connected using logins user_A and user_B, and none of the user databases are visible after denying the VIEW ANY DATABASE permission to public.

Connecting as logins for which DENY VIEW is enabled

 

Only sysadmins and database owners can see databases 

To allow the logins to see their databases, I will make both logins the owners for their respective databases. User_A will be owner of database A and user_B will be the owner of database B. Run the below statements to change the database owners.

USE A
GO
EXEC sp_changedbowner 'user_A'
GO
USE B
GO
EXEC sp_changedbowner 'user_B'

We can check the database owners by running sp_helpdb. As you can see in the below screenshot, the database owners have been changed for both databases.

Change DB Owner of databases

Now we can connect to the SQL Server instance again using both logins and see the changes compared to before.  Here we can see that only one database is visible for both logins. Database A is visible to user_A and database B is visible to user_B. This is because both logins are now the database owners of these databases.

Output of hiding database after changing database owner

 

Does making a user a db_owner work?

Now we will create a new login user_C and assign db_owner access to both databases and check whether these databases are visible to this new login.

CREATE LOGIN user_C with password='U$er_c@13'
GO
USE A
GO
CREATE USER user_C for login user_C;
GO
EXEC sp_addrolemember 'db_owner', 'user_C'
GO
USE B
GO
CREATE USER user_C for login user_C
GO
EXEC sp_addrolemember 'db_owner', 'user_C'

As we can see below, neither of these databases is visible to login user_C.  So from this we can see that you have to be the database owner to see the databases in SQL Server Management Studio if DENY VIEW ANY DATABASE is enabled for public.

View for a newly created login with db_owner on both databases

 

Steps to hide databases for a specific login

Suppose we don't want to do this across the board, but only for a specific login.  We can run the below statement instead of DENY VIEW ANY DATABASE TO PUBLIC. After running it, this login won't be able to see any databases except those it owns, but all other logins can still see the databases as long as you did not also deny the view permission to public.

USE MASTER
GO
GRANT VIEW ANY DATABASE TO PUBLIC; -- turn this back on if it was off
GO
DENY VIEW ANY DATABASE TO USER_A;
GO

 

Steps to view all databases

By default, the VIEW ANY DATABASE permission is granted to the public role. Therefore, by default, every user that connects to an instance of SQL Server can see all databases in the instance. To grant the VIEW ANY DATABASE permission to a specific login or to all logins run the following query:

--To grant the VIEW ANY DATABASE permission to a specific login.
USE MASTER
GO
GRANT VIEW ANY DATABASE TO [login_name]; 
GO
--To grant the VIEW ANY DATABASE permission to public.
USE MASTER
GO
GRANT VIEW ANY DATABASE TO PUBLIC; 
Go

Note that DENY VIEW to PUBLIC overrides the setting for an individual login, so if you DENY VIEW to PUBLIC and GRANT VIEW to a specific login, that login will still not be able to see the databases. 
If you are using DENY VIEW to PUBLIC and you want a login to still be able to see all databases without making that login a sysadmin you can do the following.  Make the login a user in the master database and make that user a db_owner of the master database.  This is not a very good option from a security perspective, but this does work.  This way a login can see all databases without having to be a sysadmin.
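Applied to login user_A from the earlier examples, that workaround would look something like the following (a sketch in the style of the scripts above; test it outside production first):

```sql
USE master
GO
-- Make the login a user in the master database...
CREATE USER user_A FOR LOGIN user_A;
GO
-- ...and make that user a db_owner of master, so the login can see
-- all databases without being a sysadmin.
EXEC sp_addrolemember 'db_owner', 'user_A'
GO
```

Again, granting db_owner on master is a significant privilege, which is why this is a last resort from a security perspective.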

Stairway to PowerPivot and DAX - Level 1: Getting Started with PowerPivot and DAX




The Series

Data Analysis Expressions (DAX) is the standard PowerPivot formula language that supports custom calculations in PowerPivot tables and Excel PivotTables. While many of the functions used in Excel are included, DAX also offers additional functions for carrying out dynamic aggregation and other operations with your data. Bill's new series serves as a progressive introduction to PowerPivot and DAX formulas, examining the functions, operators and values involved, and offering a wealth of practical examples.
DAX, which stands for Data Analysis Expressions, comprises the PowerPivot formula language that supports custom calculations in PowerPivot tables and Excel PivotTables. While many of the functions used in Excel are included within the language, DAX also offers additional functions for carrying out dynamic aggregation and other operations with your data.  DAX was introduced with PowerPivot, a free add-in for Excel 2010 that promises true “BI for the masses.”  PowerPivot holds much promise for analysts, reporting specialists and other information consumers, particularly those who are “at home” with Excel, and who serve the enterprise outside traditional IT and development roles. 
PowerPivot offers those who adopt it the capability of creating BI solutions rapidly, and often without the need for the resources required to construct traditional BI solutions.  Moreover, my experience has shown that, even in environments where traditional solutions (including everything from ETL mechanisms, underlying star-schema data sources, Analysis Services cubes, and the languages required to support all these components) are desired, PowerPivot can be used effectively to make the design, development and testing of the ultimate solution more efficient and reliable.
The PowerPivot / DAX combination insulates Excel users from many of the complexities of other components of the integrated Microsoft BI solution.  The DAX functions are intuitive, in large part, for those who work routinely with existing Excel functions, and enable knowledgeable users to rapidly extract and present the information needed to support enterprise decision makers in a timely, reliable manner.  In working with columns and tables (even very large ones) in relational data sources, we can enjoy high-speed lookups and calculations via an in-memory engine that DAX is designed to leverage in an optimal manner.
In this series, our primary objective is to become comfortable with creating useful queries, within a business context, using the PowerPivot / DAX combination. As a means of achieving our objective, we will gain hands-on exposure to using PowerPivot in general, while becoming familiar with the DAX language, using formulas we construct from the DAX functions we introduce within the levels of this Stairways series. For each function we introduce, we will discuss what it is designed to produce and its operation in doing so, the syntax with which it is employed, and the data that it retrieves and presents.  The objective will be to encourage you to take the example code as you read the ‘steps’, and try it out, making changes and experimenting, so as to get the feel for the capabilities of PowerPivot  and DAX. We’ll illustrate every point with practice examples of business needs that we address with practical syntax. Try them out.
To get the most out of the series, you need to have installed either 64- or 32-bit Excel 2010, matched with the respective version of PowerPivot for Excel 2010. PowerPivot for Excel 2010 is, at this writing, a free add-in that is available for download at www.powerpivot.com.  For the lion’s share of our practice exercises, which focus upon the Microsoft BI stack, we will use the Adventure Works relational OLTP and Data Warehouse database samples, as well as the Adventure Works Analysis Services database (and the Adventure Works cube it contains) sample, the installation of which we discuss in the section below.
You will also need the appropriate access rights to the sample data sources provided for SQL Server 2008R2.  Installation of the Standard edition of SQL Server 2008R2 will be adequate for the vast majority of our activities, although the Developer / Enterprise edition is certainly ideal.  I will provide references for step-by-step installation of SQL Server 2008R2 Developer / Enterprise in the section that follows, and the vast majority of the images presented in this series will reflect those environments.
It is also assumed that the computer(s) involved meet the system requirements, including hardware and operating systems, of the applications I have mentioned.
Important Note:  If you have no alternative except to work with SQL Server 2005 or 2008, the practice exercises of this series can perhaps be meaningfully completed with modification of the queries to compensate for differences in the data structures of the sample cube among the versions – although you may find this requirement cumbersome and distracting.  Because both the relational databases and the Analysis Services samples for 2008R2 differ somewhat from those of previous releases (a good example of this is that the Analysis Services date dimension, as well as the supporting relational data, has been advanced into later operating years of the Adventure Works organization), the sample formula syntax that we construct together will, when executed, deliver results which may differ between 2008R2 data sources and those of the previous releases.  While you can adjust your own steps to make up for these differences (perhaps by “checking your answers” independently), you will not have the added comfort of the “instant corroboration” available in simply comparing your results to those presented in the images and explanations I present in the exercises. 
Consider working, therefore, with 2008R2, if at all possible:  learning the basics of a new language is typically challenging enough for most that are new to it, without the additional distractions imposed by working with older releases.
Additional Note: The screen captures in this series, unless otherwise noted, are made from a Windows 7 or Windows Server 2008 R2 environment, so what you see on your own machine may differ, somewhat, if you are working within another environment.

Installing PowerPivot, Analysis Services 2008R2 and Samples

To install PowerPivot, simply perform the steps as outlined at the download site (www.powerpivot.com, at this writing), on a PC with Excel 2010 installed.
SQL Server 2008 and 2008R2 each provide a virtually identical single Setup program from which you can install any or all of its components, including Analysis Services. Using the unified Setup, you can install Analysis Services with or without other SQL Server components on a single computer.  It is important, however, to understand that Analysis Services relies upon other components of SQL Server: for example, the Adventure Works cube (which resides within the Adventure Works DW 2008R2 Analysis Services database), uses the AdventureWorksDW2008R2 relational data mart in SQL Server as its data source, so, if we want to process the Analysis Services database and its cube (we cannot query an unprocessed cube), we will need to have access to its underlying relational data source and, therefore, to SQL Server and the associated sample database.
There are many possible considerations in the installation of Analysis Services, depending upon the version(s) you intend to install, the hardware in your local environment, applications you may already have in place, and so forth. Rather than trying to reproduce them all in this article, we provide the following link, which covers this subject thoroughly, yet efficiently.
Considerations for Installing Analysis Services
(http://technet.microsoft.com/en-us/library/ms143708.aspx)
Once you have determined the components you need to install, you can follow step-by-step instructions on how to start Setup, and to select the components you want to install, by following this link:
Quick-Start Installation of SQL Server 2008
(http://technet.microsoft.com/en-us/library/bb500433.aspx)
Once you have successfully installed Analysis Services (along with any other components you have chosen from the Setup program), you are ready to download and install the samples that we will be working with in this series.  A great summary of the options that are available (based upon your SQL Server version and other considerations) can be found at:
How to install Adventure Works SQL DW and Analysis Services 2005/2008 sample database and project
http://www.ssas-info.com/analysis-services-faq/29-mgmt/242-how-install-adventure-works-dw-database-analysis-services-2005-sample-database

Getting Started with PowerPivot

Once we’ve got the PowerPivot for Excel 2010 add-in, SQL Server 2008R2 and the samples noted above installed, we’re ready to get started with PowerPivot.  Opening PowerPivot by taking the steps below will put us in position to begin working with DAX formulas.  An open PowerPivot window, the point at which we will be at the end of this section, will become the beginning point of each of the levels of this series.
  1. From the Start menu, select Microsoft Excel 2010.
  2. Above the Excel ribbon, click the PowerPivot tab, as shown.
Illustration 1:  Click the PowerPivot Tab …
The PowerPivot ribbon appears.
  3. Click the PowerPivot Window button, appearing at the left of the newly appearing PowerPivot toolbar, as shown.
Illustration 2:  The PowerPivot Ribbon Appears
The PowerPivot window, containing its own ribbon, opens atop the existing Excel spreadsheet.  We can tell to which workbook the window is linked because it assumes the name of the workbook as part of its own name, as depicted below.
Illustration 3:  The PowerPivot Window, Associated with the Workbook, Appears
It is in the PowerPivot window that we load and prepare the data with which we will be working (or will continue working, with data already added to the workbook). We will typically build a relational model here.  As we’ll see shortly, the PowerPivot window displays the tables on individual, tabbed sheets, and is the central place where we import tables, create relationships, maintain column data types and formats, and view, as needed, the data that underlies our data model.
Next, we’ll designate a source from which to import data.
  4. Click the From Database button on the PowerPivot window.
  5. Select From SQL Server on the drop-down menu that appears next.
Illustration 4:  Select “From SQL Server” in the From Database Dropdown
The Table Import Wizard dialog opens next. 
  6. In the top input box, titled Friendly connection name, type (or copy and paste) the following:
AdventureWorksDW2008R2
  7. Click the selector (downward pointing arrow) on the right side of the box titled Server name.
PowerPivot begins a scan of the machine to detect, and return to the selector, the available server choices.
  8. Select the appropriate server for your local environment, or type in the server name / localhost, as appropriate.
  9. Enter the authentication as required for the local environment (ideally selecting Use Windows Authentication).
  10. Select AdventureWorksDW2008R2 using the dropdown selector to the right of the box titled, Database name, at the bottom of the dialog.
The Table Import Wizard dialog, with our input, appears similar to that depicted below.
Illustration 5:  The Table Import Wizard Dialog, with Our Input
  1. Click the Test Connection button underneath the Database name selector.
A message box appears, indicating that the test connection has succeeded.
Illustration 6:  “Test Connection Succeeded”
  1. Click OK to dismiss the message box.
  2. Click the Next button at the bottom of the Table Import Wizard dialog.
  3. Leave the radio button on the dialog that appears next, labeled Choose How to Import the Data, at its default of Select from a list of tables and views to choose the data to import, as shown.
Illustration 7:  Choose “Select from a list of tables…” Option for Data Import
  1. Click the Next button.
PowerPivot loads the tables and views from the AdventureWorksDW2008R2 database into the Select Tables and Views dialog that appears next.
  1. Select the following tables,  by clicking the checkbox to the immediate left of the respective  table listing in the dialog:
  • DimAccount
  • DimCurrency
  • DimCustomer
  • DimDate
  • DimGeography
  • DimProduct
  • DimPromotion
  • DimSalesReason
  • DimSalesTerritory
  • FactInternetSales
  • FactInternetSalesReason
  2. Click the Select Related Tables button next.
The Select Tables and Views dialog appears, as partially shown, with our selection and related tables checked.
Illustration 8:  Selecting Tables (Partial View) …
The idea in these immediate steps is to import enough information into our model to allow us to do some illustrative analysis surrounding the business activities conducted by our hypothetical client, the Adventure Works organization.
  1. Click the Finish button.
The import process runs, and then we see a “Success” message, complete with a Details pane listing the tables populated by our choices.
Illustration 9:  Success is Indicated, Along with a List of Tables Imported
  1. Click Close.
The Table Import Wizard is dismissed, and we arrive at the PowerPivot window once again, where we see the imported data as partially depicted.
Illustration 10:  Imported Data, with a Tab for Each Table
Note that a tab for each imported table has been created in the PowerPivot window (the tabs appear in the bottom left of the window as seen above).  What we are now seeing is not an Excel table, but a view of the efficiently compressed columnar database that PowerPivot uses to store imported tables in memory.
The central pane contains the data and looks very similar to (but is not) an Excel table inside a worksheet. Keep in mind that PowerPivot tables and Excel tables are completely different objects: PowerPivot uses far less memory than the Excel tables with which we have become familiar, due to its superior data compression capability.  The database generated by PowerPivot saves space and supports highly efficient querying. 

Getting Started with DAX

We learned within the introduction to this level that PowerPivot for Excel 2010 offers the DAX language as a means for extending our capabilities with PowerPivot.  Let’s get some hands-on exposure to some basic DAX expressions as a way of getting up to speed and positioning ourselves to learn about individual functions in the levels of this series.
We’ll get started with some simple exercises.  To begin, let’s assume that we have a need to create a combined Product code / name label, primarily for report parameter picklist support and report labeling, but perhaps for other uses, as well, over time.  (We will use the ProductAlternateKey for this purpose, as the ProductKey in the table is a surrogate key, and not the actual product identifier.) This simple concatenation will serve as a great starting point in our introduction to DAX formulas and expressions.
  1. Click the tab named DimProduct in the PowerPivot window.
  2. Scroll over to, and click, the header of the far right column on the tab, which is labeled Add Column.
The Formula bar (fx) is activated, upon selection of the column.
  1. Type (or copy and paste)  the following into the Formula bar:
= [ProductAlternateKey]  & " - " &  [EnglishProductName]
NOTE:  We can also use the AutoComplete feature to save typing and / or avoid typographical or syntax errors.
We can use simple column names in the above syntax because the columns reside in the same table as our calculated column.  (We have to affix the table name if the column resides elsewhere, as we’ll see in later exercises.)
  1. Press the ENTER key.
The affected area of the DimProduct tab of the PowerPivot window appears, with our new addition, as shown.
Illustration 11:  The Newly Populated Calculated Column Appears …
We note that, upon creation of the calculated column, a new column is created to its immediate right, again with “Add Column” as its label, for easy addition of the next calculated column.  Our new calculated column, containing the new combined Product label, is named CalculatedColumn1 by default.  Let’s modify this to something more meaningful.
  1. Double click the column header of the newly added calculated column, currently containing the CalculatedColumn1 label.
  2. Type (or copy and paste)  the following into the column heading:
ProductLabel
  3. Press the ENTER key.
The column header now displays the desired title.
Illustration 12:  The New Title Appears …
In our example above, we used the “&” operator to concatenate two columns, together with a text string in between.  As we will see throughout the levels of Stairway to DAX, DAX provides many operators, which, with only a few exceptions, perform just like they do in Microsoft Excel.  These operators include those shown in Table 1.
Operator    How Employed

Arithmetic Class
+           Addition
-           Subtraction (or negation)
*           Multiplication
/           Division
^           Exponentiation

Parenthetic Class
( )         Grouping / Precedence

Comparison Class
=           Equal to
<           Less than
<=          Less than or equal to
>           Greater than
>=          Greater than or equal to
<>          Not equal to

Text Concatenation Class
&           Concatenation

Logic Operator Class
&&          And
||          Or
!           Not / negation

Table 1:  DAX Operators
We will become familiar with these operators as we work through the levels of our series.
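To see a few of these operators working together, consider the following hypothetical calculated-column formulas (a sketch only; the column names are illustrative and are not part of the exercises in this level):

```DAX
-- Arithmetic and parentheses: an extended amount per row
= ([UnitPrice] - [DiscountAmount]) * [OrderQuantity]

-- A comparison inside IF(): a 10 percent discount for quantities of 10 or more
= IF([OrderQuantity] >= 10, [UnitPrice] * 0.9, [UnitPrice])
```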
Next, as a prelude to moving into the core focus of our series, we’ll introduce DAX formulas and functions.  We will then be ready to examine our first function. 

DAX Formulas and Functions

We can exploit DAX formulas to create either calculated columns or measures. Let’s briefly introduce each of these “destinations” before kicking off the examination of our first function.  We define calculated columns in the PowerPivot window, as we discovered in our last section.  By contrast, we create measures within an Excel worksheet, where one or more pivot tables / charts reside which are supported by a PowerPivot table.  The difference in where calculated columns and calculated measures are created is important to understand. 
We create a calculated column from a PowerPivot window, either like we did above, by moving to and clicking the column with the header labeled “Add Column,” or by clicking the Add button in the Columns group within the Design ribbon.  (We will create calculated columns via the Add button later.)  Using either option, we input the DAX formula into the formula bar (by typing it, using Autosense, or a combination of both), as we did earlier, and then press ENTER. The PowerPivot Field List for any pre-existing pivot tables will notify us of changes in the underlying PowerPivot window, enabling a Refresh button we can use to update both the field list and the pivot table.
A measure is created from within an Excel workbook, within which a PivotTable / Pivot report must already exist.  Once we have focus within the Pivot Table / report, or within the corresponding PowerPivot Field List, we can either right-click one of the tables in the field list or choose Add New Measure.  Either action launches the Measure Settings dialog, as we shall see, into whose Formula text box we can input a desired DAX formula.  We will get plenty of hands-on exposure to creating measures, as well as opportunities to learn about the importance of measure context, as we move through the levels of our series.  For now, it’s enough to understand that we can use DAX formulas to create calculated columns and measures, and the basic differences between the two.
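As a quick preview (a sketch only; we will create measures properly, via the Measure Settings dialog, in later levels), a measure formula looks much like a calculated-column formula, but aggregates over the rows visible in the current PivotTable context:

```DAX
-- A simple measure: total sales for whatever context filters the pivot cell
=SUM(FactInternetSales[SalesAmount])
```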
Next, let’s take a look at our first DAX function.

The RELATED Function

According to the Microsoft DAX Function Reference for PowerPivot, the RELATED() function “returns a related value from another table.”
The source column (“<column>”) is placed within the parentheses to the right of the word “RELATED,” as shown below:
RELATED(<column>)
The <column> placeholder represents the column containing the values that we wish to retrieve via the function.  A single value related to the current row is returned.  RELATED() requires that a relationship is in place between the current table (from which we are using the function) and the table with the related data.  Using RELATED() we specify the column containing the desired data, and RELATED() traces the relevant relationship (many-to-one) to retrieve the value from the column we specify in the related table.
RELATED() performs a lookup, therefore, based upon the relationship in place, and examines all the values within the table we specify (ignoring any filters we have put in place, if applicable).  A relationship must exist, of course, for the function to work.  If there is no relationship, we have to create one to be able to use the function.  (We work with relationships in other steps of our series, so we will become familiar with the straightforward process at a later time.)  RELATED() works much like the Excel VLOOKUP() function, but is significantly more flexible in various ways.  We will encounter various uses of RELATED(), often in conjunction with other DAX functions, throughout this series, where we will get a feel for the natural hierarchies it renders and so forth.
RELATED() can be used in a calculated column expression (the way we’ll work with it in this level), where the current row context is meaningful (we’ll talk about row context shortly).  RELATED() can also be employed as a nested function within an expression using a table scanning function, which gets the current row value and then scans another table for instances of that value.  (SUMX(), which we will examine early in the steps of this series, is one example of a popular table scanning function).  RELATED() allows us to leverage relationships easily and transparently – and in a way that the information consumer is insulated from the underlying database components.
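As an illustration only (we will build expressions like this hands-on when we reach SUMX() in a later level), RELATED() nested within a table scanning function might look like this:

```DAX
-- For each row of FactInternetSales, RELATED() traces the relationship to
-- DimProduct to retrieve the product's standard cost; SUMX() totals the result
=SUMX(FactInternetSales, FactInternetSales[OrderQuantity] * RELATED(DimProduct[StandardCost]))
```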
Let’s take a look at an example of the DAX RELATED() function at work within the AdventureWorksDW2008R2 sample we’ve established.
  1. Click the Design tab atop the PowerPivot window.
Illustration 13:  The Design Tab atop the PowerPivot Window
  1. Click the Manage Relationships button on the Design tab, as shown.
Illustration 14:  The Manage Relationships Button on the Design Tab
The Manage Relationships dialog opens, and we see various relationships between tables that were defined within PowerPivot when we clicked the Select Related Tables button in our PowerPivot setup steps.
  1.  Click the fifth row from the top in the table list of the dialog, labeled DimProduct [ProductSubcategoryKey], as depicted, to select it.
Illustration 15:  Our Selection in the Manage Relationships Dialog
  1. Click the Edit button atop the dialog next.
The Edit Relationship dialog opens, and we are presented with the definition of the relationship between the DimProduct and DimProductSubcategory tables, as shown.
Illustration 16:  The Edit Relationship Dialog …
The Edit Relationship dialog tells us that the related columns are DimProduct.ProductSubcategoryKey and DimProductSubcategory.ProductSubcategoryKey.  We are therefore aware of the details of the relationship between the two tables via these columns.  The existence of a relationship will allow us to use the RELATED() function in a way that we can be certain that the data it retrieves is accurate.
  1. Click OK to dismiss the Edit Relationship dialog.
  2. Click the Close button in the Manage Relationships dialog.
We return to the PowerPivot window – DimProduct tab.  Now let’s get some exposure to the RELATED() function.
  1. Scroll over to, and click, the header of the column labeled Add Column, this time to the immediate right of the calculated column we added earlier, ProductLabel.
The Formula bar (fx) is activated, once again, upon selection of the column.
  1. Type (or copy and paste)  the following into the Formula bar:
=RELATED(D
  2. Select the following from the Autosense-enabled selector that appears:
DimProductSubcategory[ProductSubcategoryAlternateKey]
as shown in this partial view:
Illustration 17:  Selecting a Column in a Related Table …
  1. Once the selection appears in the formula bar, add a closing parenthesis ( ")" ) to complete the expression, which will then appear in the formula bar as shown:
Illustration 18:  The Completed Expression …
Note that we provide the table name before the column name, as the column resides in another table. 
  1. Press the ENTER key.
The DAX formula containing the RELATED() function is accepted and the calculation populates the rows where a ProductSubcategoryAlternateKey exists, leaving blank those where one does not (we can scroll down to see populated rows lower in the tab).
  1. Double click the column header of the calculated column, as we did with our first calculated column.
  2. Type (or copy and paste)  the following into the column heading:
SubCatAltKey
  3. Press the ENTER key.
The column header now displays the desired title.
Illustration 19:  The SubCatAltKey Calculated Column Appears …
Let’s add a couple more columns to reinforce our understanding of basic DAX formula creation and the RELATED() function.
  1. Click the column header labeled Add Column to the immediate right of the newly added SubCatAltKey calculated column.
  2. Type (use Autosense, or copy and paste, as desired) the following into the Formula bar:
=RELATED(DimProductSubcategory[EnglishProductSubcategoryName])
  3. Press the ENTER key, once again.
The DAX formula containing the RELATED() function is again accepted and the calculation populates the rows where an instance of EnglishProductSubcategoryName exists, leaving blank those where one does not, as before.
  1. Double click the column header of the newly added calculated column, once again.
  2. Type (or copy and paste)  the following into the column heading:
SubCatName
  3. Press the ENTER key.
The column header now displays the desired title, as in previous steps.
Finally, let’s insert column to create a label combining the new SubCatAltKey and SubCatName columns in a manner similar to the way we created our first calculated column, ProductLabel.
  1. Click the column labeled Add Column to the immediate right of the newly added SubCatName calculated column.
  2. Type (or copy and paste, as desired) the following into the Formula bar:
 = [SubCatAltKey]  & " - " &  [SubCatName]
  3. Press the ENTER key, once again.
The DAX concatenation formula is accepted and the calculation populates the rows, leaving only a solitary hyphen ( "-" ) in the rows where the SubCatAltKey and SubCatName columns are empty.
  1. Double click the column header of the newly added calculated column, once again.
  2. Type (or copy and paste)  the following into the column heading:
SubCatLabel
  3. Press the ENTER key.
The column header now displays the desired title, as in previous steps.  The new calculated column appears, alongside the calculated columns combined to create it, as partially depicted (for a sample of populated rows) below.
Illustration 20:  The SubCatLabel Calculated Column Appears …
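Incidentally, the solitary hyphen left in rows without a subcategory could be suppressed with the DAX IF() and ISBLANK() functions; the variation below is a sketch, and not part of the steps in this level:

```DAX
-- Return BLANK instead of a lone hyphen when no subcategory exists
= IF(ISBLANK([SubCatAltKey]), BLANK(), [SubCatAltKey] & " - " & [SubCatName])
```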
And so we see that our new calculated column delivers the desired results.  Let’s save the PowerPivot work to date, along with the spreadsheet with which it is associated.
  1. Using the downward pointing selector in the upper left corner of the PowerPivot window (to the immediate left of the Home and Design tabs), select Save As …, as shown.
Illustration 21:  Saving Our Work to Date …
  1. Name the file ST_DAX01-1.xlsx, and save it in a meaningful location.
  2. Exit Excel 2010 as desired.
We are now positioned to pull our PowerPivot data into Excel and perform analysis via a PivotTable, present data via a Pivot report, and so forth.
We will explore many functions, within the context of various formulas that we craft to meet representative business needs, as we move into subsequent steps’ coverage of DAX, and PowerPivot in general. A grasp of the functions, as well as agile practices surrounding the use of PowerPivot in querying and modeling data, will be vital to taking advantage of the vast opportunities that PowerPivot offers us.  Practice with these components will ensure that their use becomes second nature, and will create a foundation from which the power and elegance of PowerPivot can be fully exploited.

Summary …

With this article, I introduced the Stairway to PowerPivot and DAX series.  I began by noting that the series is designed to provide a hands-on introduction to the basics of PowerPivot and the Data Analysis Expressions (DAX) language, with each level progressively exposing an individual function, technique or other component designed to help you meet specific real-world needs.  We noted that our primary objective within this series is to make those new to PowerPivot more comfortable with creating useful queries, within a business context, using the PowerPivot / DAX combination. Our means to that end will be to obtain hands-on exposure to using PowerPivot in general, while achieving an understanding of the DAX language, using formulas we construct from the DAX functions we introduce within the levels of this Stairway series.
For each function we introduce within the levels of the series, we will discuss what it is designed to produce, how it operates, the syntax with which it is employed, and the data that it retrieves and presents.  I hope that this approach will encourage you to take the example code I present in the levels and try it out, making changes and experimenting, so as to get a feel for the capabilities of DAX within PowerPivot. We’ll illustrate every point with practice examples of business uses, along with useful queries.
In this, our first level of Stairway to PowerPivot and DAX, we introduced PowerPivot and DAX, discussing what you need to have in place to get started with the series.  In preparation for the exercises to follow, we walked through the performance of an import, bringing in several tables from a sample SQL Server database that is readily available via sources we noted on the web.  We next discussed DAX formulas and functions in general, and moved rapidly into our first exposure to DAX.  We looked at the RELATED() function, and exposed a basic approach to using it in a simple scenario.  Finally, we discussed the results we obtained from using the function in our practice examples, and looked forward to similar steps with the many DAX functions in future levels.

Session Time Out Warning Message In jQuery




In this article we will look at how to show a warning message to the user 15 minutes before the session times out; in other words, if the user is idle for more than 5 minutes, the user gets a warning message in the browser.
 
What is Session?
When we work with an application on our computer, we follow some procedures, like opening an application, making some changes and then closing it. These procedures are much like a session, because there is an interaction between the user and the computer, so the computer knows who the user is and everything we do with the application. On the internet, however, there is one problem: the web server doesn't know who the user is or what he does, because the web is stateless, which means a new instance of a web page class is created each time the page is posted back to the server. In other words, HTTP is a stateless protocol and doesn't hold user/client information across pages.
Session data is stored on a per client basis; in other words for every client, the session data is stored separately. A Session provides the ability to store information in server memory and that information can be retrieved for any page.

Use jQuery
We need to add a JavaScript file reference to the web page so that we can use jQuery functions to show a message before the session times out. So we load jQuery first.
<script src="jquery/js/jquery-1.8.3.js" type="text/javascript"></script>

Variable Declaration
We need to declare some global variables; these are used to calculate the user's idle time and session time.
//How frequently to check for session expiration, in milliseconds
var sess_pollInterval = 60000;
//How many minutes the session is valid for
var sess_expirationMinutes = 20;
//How many minutes of idle time before the warning prompt
var sess_warningMinutes = 5;
var sess_intervalID;
var sess_lastActivity;
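As a quick sanity check on these settings (a standalone sketch, runnable outside the page), the poll interval of 60000 milliseconds is one minute, and the warning leaves the user 15 minutes before expiration:

```javascript
var sess_pollInterval = 60000;     // milliseconds between idle checks
var sess_expirationMinutes = 20;   // total session lifetime in minutes
var sess_warningMinutes = 5;       // idle minutes before the warning fires

var pollMinutes = sess_pollInterval / 1000 / 60;
var minutesLeftAtWarning = sess_expirationMinutes - sess_warningMinutes;

console.log(pollMinutes);          // 1
console.log(minutesLeftAtWarning); // 15
```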

Session Initialization

We need to create a JavaScript function that initializes the session timer when the page loads. Here we use the sess_lastActivity variable to store the time the page was loaded, and after that we set the session interval. We also bind a keypress handler so that user activity updates sess_lastActivity.

function initSession()
{
    sess_lastActivity = new Date();
    sessSetInterval();
    $(document).bind('keypress.session', function (ed, e)
    {
        sessKeyPressed(ed, e);
    });
}

Session Interval and Warning Message
We create two functions: one to set the session interval, and another to calculate the elapsed idle time and show a message shortly before the session times out.

function sessSetInterval()
{
    sess_intervalID = setInterval('sessInterval()', sess_pollInterval);
}

function sessInterval()
{
    var now = new Date();
    //get the difference in milliseconds
    var diff = now - sess_lastActivity;
    //convert the difference to minutes
    var diffMins = (diff / 1000 / 60);
    if (diffMins >= sess_warningMinutes)
    {
        //warn before expiring: stop the timer
        sessClearInterval();
        //prompt for attention
        var active = confirm('Your session will expire in ' + (sess_expirationMinutes - sess_warningMinutes) +
            ' minutes (as of ' + now.toTimeString() + '), press OK to remain logged in ' +
            'or press Cancel to log off. \nIf you are logged off any changes will be lost.');
        if (active == true)
        {
            now = new Date();
            diff = now - sess_lastActivity;
            diffMins = (diff / 1000 / 60);

            if (diffMins > sess_expirationMinutes)
            {
                sessLogOut();
            }
            else
            {
                initSession();
                sessSetInterval();
                sess_lastActivity = new Date();
            }
        }
        else
        {
            sessLogOut();
        }
    }
}
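To make the minute arithmetic in sessInterval() concrete, here is a standalone sketch of the millisecond-to-minute conversion, using hypothetical timestamps:

```javascript
// Subtracting two Date objects yields the difference in milliseconds
var lastActivity = new Date(2013, 7, 29, 12, 0, 0);  // hypothetical page-load time
var now = new Date(2013, 7, 29, 12, 7, 30);          // hypothetical current time

var diff = now - lastActivity;    // 450000 ms
var diffMins = diff / 1000 / 60;  // 7.5 minutes

console.log(diffMins >= 5);       // true: the 5-minute warning threshold is reached
```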

Logout
When the user is idle for more than 5 minutes, the warning message appears. Clicking the Cancel button moves the user to the logout page. If the user clicks the OK button only after the session has already expired (more than 15 minutes after the warning appeared), the user is also moved to the logout page. So we create a logout function, which is:

function sessLogOut()
{
    window.location.href = 'Logout.aspx';
}
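A side note on the sessSetInterval() function above: passing the string 'sessInterval()' works, but the browser must evaluate that string each time it fires. Passing the function reference directly behaves the same and avoids the string evaluation; a minimal standalone sketch:

```javascript
var pollInterval = 60000;  // milliseconds between idle checks

function tick() {
    // the idle-time check would run here
}

// Passing the function reference instead of the string 'tick()'
var intervalID = setInterval(tick, pollInterval);

// The ID returned by setInterval is what clearInterval needs to stop polling
clearInterval(intervalID);
```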

Directory Structure
The following diagram shows the directory structure of where the web form and js files exist.
solution-explorer-in-jquery.png

Entire Code
<%@ Page Language="C#" AutoEventWireup="true" CodeBehind="SessionInjQuery.aspx.cs" Inherits="JQueryExample.SessionInjQuery" %>
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head runat="server">
    <title></title>
    <script src="jquery/js/jquery-1.8.3.js" type="text/javascript"></script>
    <script type="text/javascript">

    var sess_pollInterval = 60000;
    var sess_expirationMinutes = 20;
    var sess_warningMinutes = 5;
    var sess_intervalID;
    var sess_lastActivity;

    function initSession()
    {
        sess_lastActivity = new Date();
        sessSetInterval();
        $(document).bind('keypress.session', function (ed, e)
        {
            sessKeyPressed(ed, e);
        });
    }
    function sessSetInterval()
    {
        sess_intervalID = setInterval('sessInterval()', sess_pollInterval);
    }
    function sessClearInterval()
    {
        clearInterval(sess_intervalID);
    }
    function sessKeyPressed(ed, e)
    {
        sess_lastActivity = new Date();
    }
    function sessLogOut()
    {
        window.location.href = 'Logout.aspx';
    }
    function sessInterval()
    {
        var now = new Date();
        //get the difference in milliseconds
        var diff = now - sess_lastActivity;
        //convert the difference to minutes
        var diffMins = (diff / 1000 / 60);

        if (diffMins >= sess_warningMinutes)
        {
            //warn before expiring: stop the timer
            sessClearInterval();
            //prompt for attention
            var active = confirm('Your session will expire in ' + (sess_expirationMinutes - sess_warningMinutes) +
                ' minutes (as of ' + now.toTimeString() + '), press OK to remain logged in ' +
                'or press Cancel to log off. \nIf you are logged off any changes will be lost.');
            if (active == true)
            {
                now = new Date();
                diff = now - sess_lastActivity;
                diffMins = (diff / 1000 / 60);

                if (diffMins > sess_expirationMinutes)
                {
                    sessLogOut();
                }
                else
                {
                    initSession();
                    sessSetInterval();
                    sess_lastActivity = new Date();
                }
            }
            else
            {
                sessLogOut();
            }
        }
    }

    </script>
</head>
<body onload="initSession()">
    <form id="form1" runat="server">
    <div>
        <h1>Your Most Welcome!</h1>
    </div>
    </form>
</body>
</html>

Warning Message
message-box-in-jquery.png

In the warning message we have two buttons: "OK" and "Cancel". When we click the "OK" button within 15 minutes of the warning message being shown in the browser, the session time interval is reset, but when we click the "OK" button more than 15 minutes after the warning message appeared, we move to the logout page. Whenever we click the "Cancel" button, we also move to the logout page.
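The OK / Cancel behavior can be summarized as a small decision function; decide() below is a hypothetical helper for illustration, not part of the page's code:

```javascript
// Decide what happens when the user responds to the warning prompt
function decide(clickedOK, idleMinutes, expirationMinutes) {
    if (!clickedOK) {
        return 'logout';               // Cancel always logs off
    }
    if (idleMinutes > expirationMinutes) {
        return 'logout';               // OK, but the session already expired
    }
    return 'reset';                    // OK in time: restart the idle timer
}

console.log(decide(false, 6, 20));  // logout
console.log(decide(true, 25, 20));  // logout
console.log(decide(true, 6, 20));   // reset
```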

meaasge-print-injquery.png