Search Results
Search found 1323 results on 53 pages for 'dr giles m'.
-
System Virtualization
Methods and applications of system virtualization using Intel virtualization technology
Intel Corporation - Intel - Companies - Product Support - Hardware
-
Custom Parallel Partitioning With .NET 4
Programming - Parallel computing - Component Frameworks - NET - Tools
-
iPad Development: It's Different in an Exciting Sort of Way
The power of a desktop and the portability of mobile -- all in the same device
Fujitsu iPAD - Programming - ipad - Business - Languages
-
A Portable Security Risk
The ubiquity of personal devices with built-in web connectivity, office applications, and email is fraught with risks to businesses
Business - Business Services - Ubiquity - Mozilla Firefox - Aza Raskin
-
Parallelism: Who Cares?
Parallel computing - Programming - Languages - Hardware - C++
-
TeraGrid Application Deadline Approaches
Time requests must be in by April 15
Business - Management Science - Management - Management Information Systems - Call For Papers
-
The iPhone Isn't Easy
How to get started building an app
iPhone - Smartphone - Handhelds - App Store - Android
-
SQLAuthority News – Meeting SQL Friends – SQLPASS 2011 Event Log
- by pinaldave
One of the biggest reasons I go to SQLPASS is that my friends are going there too. There are so many friends with whom I often talk on Facebook and Twitter, but I rarely get time to meet and talk with them in person. One thing I am usually sure of is that many of them will attend SQLPASS. This is one event which every SQL Server enthusiast should attend. Just like everybody, I had a pleasant time meeting many of my SQL friends. There were so many friends I met without clicking a photo, and so many friends who clicked photos on their cameras that I do not have. Here is the 1% of the photos that I do have. If you are not in a photo, it does not mean I have less respect for our friendship. Please post a link to our photo together :)

I was very fortunate that I was able to snap a quick photograph with Dr. David DeWitt. I stood outside the hall waiting for him to show up, and when he was heading down from the convention center I asked him if I could have one photo for my memory lane; very politely he agreed. It indeed made my day!

Pinal Dave with Dr. David DeWitt

Every single time I meet Steve, I make sure I have one photo for my memory. Steve is so kind every single time. If you know SQL and do not know Steve Jones, you do not know SQL (IMHO).

Following is the photograph with Michael McLean. More details about this photo in a future blog post!

Pinal Dave, Michael McLean, and Rick Morelan

Arnie always shares his wisdom with me. I still remember the very first time I visited the USA: I was standing alone in a corner, and Arnie walked over to me and introduced me to every single person he knew. Talking to Arnie is always a pleasure and always inspiring.

Arnie Rowland and Pinal Dave

I am now a published author and have written two books so far. I am fortunate to have Rick Morelan as co-author of both of my books. He is a great guy and very easy to become friends with. I was very much impressed by him and his kindness while co-authoring the books. Here is the very first of our photographs together at SQLPASS.

Rick Morelan and Pinal Dave

Diego Nogare and I have been talking for a long time on Twitter and various other social media channels. I finally got the chance to meet my friend from Brazil. It was an excellent experience to meet a friend whom one has wanted to meet for a long time and never had the chance to before.

Buck Woody – who does not know Buck? He is funny, kind and, most important, a friend to everyone. Buck is so kind that he does not hesitate to approach people, even though he is famous and well known in the community. Every time I meet him I learn something. He is always smiling and approachable.

Pinal Dave and Buck Woody

Rushabh Mehta is the current SQL PASS president and a personal friend. He always has a smiling face and tremendous love for the SQL community. I often wonder where he finds the time for all the effort he puts in for the community. I never miss a chance to meet and greet him. Even though he is a renowned SQL guru and an extremely busy person, every single time I meet him he asks me, "How are Nupur and Shaivi?" He even remembers my wife's and daughter's names. I am touched.

Rushabh Mehta and Pinal Dave

Nigel Sammy has an extremely good sense of humor and a passion for the community. We have excellent synergy when we are together. The attached photo was taken while I was talking to him about SQL on the Seattle shoreline.

Pinal Dave and Nigel Sammy

Rick Morelan wanted this trip of mine to be memorable. I am vegetarian, and I told him that I do not like seafood. Well, to prove a point, he took me to a fantastic seafood restaurant in Seattle and treated me to mouth-watering vegetarian dishes. I think the next time I go to Seattle, I am going to make him take me to the same place again.

Rick, Rushabh, Pinal and Paras

Well, this is a short summary of a few of the friends I met in Seattle. What is life without friends, eh?

Reference: Pinal Dave (http://blog.SQLAuthority.com)

Filed under: Pinal Dave, PostADay, SQL, SQL Authority, SQL PASS, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority Author Visit, SQLAuthority News, T SQL, Technology
-
High Availability for IaaS, PaaS and SaaS in the Cloud
- by BuckWoody
Outages, natural disasters and unforeseen events have proved that even in a distributed architecture, you need to plan for High Availability (HA). In this entry I'll explain a few considerations for HA within Infrastructure-as-a-Service (IaaS), Platform-as-a-Service (PaaS) and Software-as-a-Service (SaaS). In a separate post I'll talk more about Disaster Recovery (DR), since each paradigm has a different way to handle that.

Planning for HA in IaaS

IaaS involves Virtual Machines, so in effect an HA strategy here takes on many of the same characteristics as it would on-premises. The primary difference is that the vendor controls the hardware, so you need to verify what they do for things like local redundancy and so on from the hardware perspective. As far as what you can control and plan for, the primary factors fall into three areas: multiple instances, geographical dispersion and task-switching.

In almost every cloud vendor I've studied, to ensure your application will be protected by any level of HA, you need to have at least two of the instances (VMs) running. This makes sense, but you might assume that the vendor just takes care of that for you - they don't. If a single VM goes down (for whatever reason), then access to it is lost. Depending on multiple factors you might be able to recover the data, but you should assume that you can't. You should keep a sync to another location (perhaps the vendor's storage system in another geographic datacenter, or a local location) to ensure you can continue to serve your clients.

You'll also need to host the same VMs in another geographical location. Everything from a vendor outage to a network path problem could prevent your users from reaching the system, so you need multiple locations to handle this. This means you'll have to figure out how to manage state between the geos. If the system goes down in the middle of a transaction, you need to figure out what part of the process the system was in, and then re-create or transfer that state to the second set of systems. If you didn't write the software yourself, this is non-trivial. You'll also need a manual or automatic process to detect the failure and re-route the traffic to your secondary location. You could flip a DNS entry (if your application can tolerate that) or invoke another process to alias the first system to the second, such as load balancing and so on. There are many options, but all of them involve coding the state into the application layer. If you've simply moved a stateful application to VMs, you may not be able to easily implement an HA solution.

Planning for HA in PaaS

Implementing HA in PaaS is a bit simpler, since it's built on the concept of stateless application deployment. Once again, you need at least two copies of each element in the solution (web roles, worker roles, etc.) to remain available in a single datacenter. You also need to deploy the application again in a separate geo, but the advantage here is that you could work out a "shared storage" model such that state is auto-balanced across the world. In fact, you don't have to maintain a "DR" site; the alternate location can be live and serving clients, and only take on extra load if the other site is not available. In Windows Azure, you can use the Traffic Manager service to route the requests as a type of auto-balancer. Even with these benefits, I recommend a second backup of storage in another geographic location. Storage is inexpensive, and that second copy can be used for not only HA but DR.

Planning for HA in SaaS

In Software-as-a-Service (such as Office 365, or Hadoop in Windows Azure) you have far less control over the HA solution, although you still maintain the responsibility to ensure you have it. Since each SaaS is different, check with the vendor on the solution for HA - and make sure you understand what they do and what you are responsible for. They may have no HA for that solution, or pin it to a particular geo, or perhaps they have massive HA built in with automatic load balancing (which is often the case).

All of these options (with the exception of SaaS) involve higher costs for the design. Do not sacrifice reliability for cost - that will always cost you more in the end. Build in the redundancy and HA at the very outset of the project; if you try to tack it on later in the process, the business will push back and potentially not implement HA.

References: http://www.bing.com/search?q=windows+azure+High+Availability (each type of implementation is different, so I'm routing you to a search on the topic - look for the "Patterns and Practices" results for the area in Azure you're interested in)
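To make the "detect the failure and re-route the traffic" idea from the IaaS section a little more concrete, here is a minimal client-side sketch in C#. It probes a list of deployments in priority order and uses the first one whose health check answers. The endpoint URLs and the /health path are hypothetical placeholders, and in a real system you would typically let a DNS flip or a routing service such as Traffic Manager do this rather than each client.

using System;
using System.Net.Http;
using System.Threading.Tasks;

// Minimal failover sketch: probe each deployment's health endpoint in priority
// order and route traffic to the first one that answers. URLs are hypothetical.
class FailoverRouter
{
    static readonly string[] Endpoints =
    {
        "https://myapp-us.example.com/health",   // primary geo (hypothetical)
        "https://myapp-eu.example.com/health"    // secondary geo (hypothetical)
    };

    static async Task<string> ResolveActiveEndpointAsync()
    {
        using (var client = new HttpClient { Timeout = TimeSpan.FromSeconds(5) })
        {
            foreach (var url in Endpoints)
            {
                try
                {
                    var response = await client.GetAsync(url);
                    if (response.IsSuccessStatusCode)
                        return url;                             // first healthy deployment wins
                }
                catch (HttpRequestException) { /* unreachable - try the next geo */ }
                catch (TaskCanceledException) { /* timed out - try the next geo */ }
            }
        }
        throw new InvalidOperationException("No deployment is currently reachable.");
    }

    static async Task Main()
    {
        Console.WriteLine("Routing traffic to: " + await ResolveActiveEndpointAsync());
    }
}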
-
How to Transform a User's Search String into an MS SQL Full-Text Search Phrase
- by Atomiton
I've searched for answers to this and I can't seem to find one, even though it should be somewhat simple. This is related to another question I asked, but it's different. What's the best way to take a user's search phrase and throw it into a CONTAINSTABLE(table, column, @phrase, topN) phrase? Say, for example, the user inputs: Books by "Dr. Seuss". What's the best way to turn that into something that will return results in my ContainsTable() phrase? I was previously parsing the search phrase and writing something like ISABOUT("Books" WEIGHT(1.0), "by" WEIGHT(0.9), "Dr. Seuss" WEIGHT(0.8)) as my @phrase, but ISABOUT seems to be returning odd results... especially when one-word searches are entered. Any ideas?
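One possible approach (a hedged sketch, not an authoritative answer) is to tokenize the input into quoted phrases and bare words, double up any embedded quotes, wrap each term in double quotes, and join the terms with AND; the result can then be passed as the @phrase parameter to CONTAINSTABLE. The class and method names below are made up for illustration. Note also that, depending on the full-text stoplist configuration, noise words such as "by" inside an AND condition can still produce empty results, so filtering them out may help.

using System;
using System.Collections.Generic;
using System.Text.RegularExpressions;

// Hypothetical helper: turns free-form input such as
//   Books by "Dr. Seuss"
// into a CONTAINSTABLE search condition such as
//   "Books" AND "by" AND "Dr. Seuss"
// Always pass the result as a parameter (@phrase); never concatenate it into the SQL text.
static class FullTextPhraseBuilder
{
    public static string Build(string userInput)
    {
        if (string.IsNullOrWhiteSpace(userInput))
            return string.Empty;

        // Capture quoted phrases first, then bare words.
        MatchCollection matches = Regex.Matches(userInput, "\"([^\"]+)\"|(\\S+)");
        var terms = new List<string>();
        foreach (Match m in matches)
        {
            string term = m.Groups[1].Success ? m.Groups[1].Value : m.Groups[2].Value;
            term = term.Trim().Replace("\"", "\"\"");   // escape embedded double quotes
            if (term.Length > 0)
                terms.Add("\"" + term + "\"");          // quote every term
        }
        return string.Join(" AND ", terms);
    }
}

// Example: Build("Books by \"Dr. Seuss\"") returns: "Books" AND "by" AND "Dr. Seuss"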
-
display data from json file in datagrid
- by kayn
I want to display data from a JSON file in a data grid using dojo version 1.0.0. I am able to display the data when I declare it in my code, but when I store the same data in JSON format so I can reference it in my script, I get an empty grid. This is my JSON file:

{ data: [
['10''myfile','Css', 'CS Degree','Dr. Bottoman','This is mine'],
['10'myfile2','CS716', 'CS Degree','Prof Frank', 'This is course'],
['10'myfile3 ','CS714', 'CS Degree', 'Dr. Ree', 'Welcome'],
['14', 'myfile4','CS772', 'CS Degree', 'Mr. Boss', 'This will display content' ],
['18', 'myfile5','CS774', 'CS Degree','Ms. Kirk', 'This is networks.' ]
] }

and below is my code:

@import "../../../dojo/resources/dojo.css";
@import "../_grid/Grid.css";

body { font-size: 1.0em; }
#grid { height: 400px; border: 1px solid silver; }
.text-oneline { white-space: nowrap; overflow: hidden; text-overflow: ellipsis; }
.text-scrolling { height: 4em; overflow: auto; }
.text-scrolling { width: 21.5em; }

dojo.require("dojox.grid.Grid");
dojo.require("dojox.grid._data.model");
dojo.require("dojo.parser");

<script type="text/javascript">
/* <span dojoType="dojo.data.ItemFileWriteStore" jsId="myStore" url="course.json"></span> */

data = [
['10''myfile','Css', 'CS Degree','Dr. Bottoman','This is mine'],
['10'myfile2','CS716', 'CS Degree','Prof Frank', 'This is course'],
['10'myfile3 ','CS714', 'CS Degree', 'Dr. Ree', 'Welcome'],
['14', 'myfile4','CS772', 'CS Degree', 'Mr. Boss', 'This will display content' ],
['18', 'myfile5','CS774', 'CS Degree','Ms. Kirk', 'This is networks.' ]
];

getDetailData = function(inRowIndex) {
  var row = data[this.grid.dataRow % data.length];
  switch (this.index) {
    case 0: return row[5];
    case 1: return row[2];
    case 2: return row[0];
    case 3: return row[1];
    case 4: return row[3];
    case 5: return row[4];
    default: return row[this.index];
  }
}

getName = function(inRowIndex) {
  var row = data[inRowIndex % data.length];
  return row[1];
}

// Main grid structure
var gridCells = [
  { type: 'dojox.GridRowView', width: '20px' },
  { onBeforeRow: function(inDataIndex, inSubRows) { inSubRows[1].hidden = !detailRows[inDataIndex]; },
    cells: [[
      { name: 'Master', width: 3, get: getCheck, styles: 'text-align: center;' },
      { name: 'Detail', get: getName, width: 60 },
    ], [
      { name: '', get: getDetail, colSpan: 2, styles: 'padding: 0; margin: 0;' }
    ]]
  }
];

// html for the +/- cell
function getCheck(inRowIndex) {
  var image = (detailRows[inRowIndex] ? 'open.gif' : 'closed.gif');
  var show = (detailRows[inRowIndex] ? 'false' : 'true')
  return '';
}

// provide html for the Detail cell in the master grid
function getDetail(inRowIndex) {
  var cell = this;
  // we can affect styles and content here, but we have to wait to access actual nodes
  setTimeout(function() { buildDetailgrid(inRowIndex, cell); }, 1);
  // look for a Detailgrid
  var Detailgrid = dijit.byId(makeDetailgridId(inRowIndex));
  var h = (Detailgrid ? Detailgrid.cacheHeight : "120") + "px";
  // insert a placeholder
  return '';
}

// the Detail cell contains a Detailgrid which we set up below
var DetailgridCells = [{
  noscroll: true,
  cells: [[
    { name: "Brief Course Description", width: "auto" },
    { name: "Course Code" },
    { name: "Credits" },
    { name: "Subject" },
    { name: "Prerequisite" },
    { name: "Lecturer" }
  ], []]
}];

var DetailgridProps = { structure: DetailgridCells, rowCount: 1, autoHeight: true, autoRender: false, "get": getDetailData };

// identify Detailgrids by their row indices
function makeDetailgridId(inRowIndex) {
  return grid.widgetId + "Detailgrid" + inRowIndex;
}

// if a Detailgrid exists at inRowIndex, detach it from the DOM
function detachDetailgrid(inRowIndex) {
  var Detailgrid = dijit.byId(makeDetailgridId(inRowIndex));
  if (Detailgrid) dojox.grid.removeNode(Detailgrid.domNode);
}

// render a Detailgrid into inCell at inRowIndex
function buildDetailgrid(inRowIndex, inCell) {
  var n = inCell.getNode(inRowIndex).firstChild;
  var id = makeDetailgridId(inRowIndex);
  var Detailgrid = dijit.byId(id);
  if (Detailgrid) {
    n.appendChild(Detailgrid.domNode);
  } else {
    DetailgridProps.dataRow = inRowIndex;
    DetailgridProps.widgetId = id;
    Detailgrid = new dojox.VirtualGrid(DetailgridProps, n);
  }
  if (Detailgrid) {
    Detailgrid.render();
    Detailgrid.cacheHeight = Detailgrid.domNode.offsetHeight;
    inCell.grid.rowHeightChanged(inRowIndex);
  }
}

// destroy Detailgrid at inRowIndex
function destroyDetailgrid(inRowIndex) {
  var Detailgrid = dijit.byId(makeDetailgridId(inRowIndex));
  if (Detailgrid) Detailgrid.destroy();
}

// when user clicks the +/-
detailRows = [];
function toggleDetail(inIndex, inShow) {
  if (!inShow) detachDetailgrid(inIndex);
  detailRows[inIndex] = inShow;
  grid.updateRow(inIndex);
}

dojo.addOnLoad(function() {
  window["grid"] = dijit.byId("grid");
  dojo.connect(grid, 'rowRemoved', destroyDetailgrid);
});

Test grid
-
Dropdownlist and Datareader
- by salvationishere
After trying many solutions listed on the internet, I am very confused now. I have a C#/SQL web application in which I am simply trying to bind the result of an ExecuteReader command to a DropDownList so the user can select a value. This is a VS2008 project on an XP OS. How it works: after the user selects a table, I use this selection as an input parameter to a method in my Datamatch.aspx.cs file. That file then calls a method from my ADONET.cs class file. Finally, this method executes a SQL procedure to return the list of columns from that table. (These are all tables in the AdventureWorks DB.) I know that this method returns the list of columns successfully if I execute the SP in SSMS. However, I'm not sure how to tell whether it works in VS or not. This should be simple. How can I do this? Here is some of my code.

The T-SQL stored proc:

CREATE PROCEDURE [dbo].[getColumnNames] @TableName VarChar(50)
AS
BEGIN
  SET NOCOUNT ON;
  SELECT col.name 'COLUMN_NAME'
  FROM sysobjects obj
  INNER JOIN syscolumns col ON obj.id = col.id
  WHERE obj.name = @TableName
END

It gives me the desired output when I execute the following from SSMS:

exec getColumnNames 'AddressType'

And the code from the Datamatch.aspx.cs file currently is:

// Add DropDownList Control to Placeholder
private void CreateDropDownLists()
{
    SqlDataReader dr2 = ADONET_methods.DisplayTableColumns(targettable);
    int NumControls = targettable.Length;
    DropDownList ddl = new DropDownList();
}

Where ADONET_methods.DisplayTableColumns(targettable) is:

public static SqlDataReader DisplayTableColumns(string tt)
{
    SqlDataReader dr = null;
    string TableName = tt;
    string connString = "Server=(local);Database=AdventureWorks;Integrated Security = SSPI";
    string errorMsg;
    SqlConnection conn2 = new SqlConnection(connString);
    SqlCommand cmd = new SqlCommand("getColumnNames");
    //conn2.CreateCommand();
    try
    {
        cmd.CommandType = CommandType.StoredProcedure;
        cmd.Connection = conn2;
        SqlParameter parm = new SqlParameter("@TableName", SqlDbType.VarChar);
        parm.Value = "Person." + TableName.Trim();
        parm.Direction = ParameterDirection.Input;
        cmd.Parameters.Add(parm);
        conn2.Open();
        dr = cmd.ExecuteReader();
    }
    catch (Exception ex)
    {
        errorMsg = ex.Message;
    }
    return dr;
}
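For reference, one thing the CreateDropDownLists method above never does is actually bind the reader to the control or add the control to the page, so even a working stored procedure would produce nothing visible. Below is a hedged sketch of that missing step: a SqlDataReader can be used directly as a DataSource for a ListControl, and DataTextField must match the column alias returned by getColumnNames ('COLUMN_NAME'). The control ID and the PlaceHolder1 container are assumptions about the page, not part of the original code.

// Sketch only: bind the reader returned by ADONET_methods.DisplayTableColumns
// to a DropDownList and add it to the page. "ColumnsDropDown" and "PlaceHolder1"
// are hypothetical names.
private void CreateDropDownLists()
{
    using (SqlDataReader dr2 = ADONET_methods.DisplayTableColumns(targettable))
    {
        DropDownList ddl = new DropDownList();
        ddl.ID = "ColumnsDropDown";
        ddl.DataSource = dr2;                  // a data reader can be bound directly
        ddl.DataTextField = "COLUMN_NAME";     // alias returned by getColumnNames
        ddl.DataValueField = "COLUMN_NAME";
        ddl.DataBind();
        PlaceHolder1.Controls.Add(ddl);
    }
}

As a related note, calling cmd.ExecuteReader(CommandBehavior.CloseConnection) in DisplayTableColumns would let the connection be released automatically when the reader is disposed.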
-
ColdFusion structs Direct Assignment vs object literal notation.
- by Tom Hubbard
The newer versions of ColdFusion (I believe CF 8 and 9) allow you to create structs with object literal notation similar to JSON. My question is: are there specific benefits (execution efficiency, maybe) to using object literal notation over individual assignments for data that is essentially static? For example, with individual assignments you would do something like this:

var user = {};
user.Fname = "MyFirstnam";
user.Lname = "MyLastName";
user.titles = [];
ArrayAppend(user.titles,'Mr');
ArrayAppend(user.titles,'Dr.');

Whereas with object literals you would do something like:

var user = {Fname = "MyFirstnam", Lname = "MyLastName", titles = ['Mr','Dr']};

Now this limited example is admittedly simple, but if titles were an array of structures (say, an array of addresses), the literal notation becomes awkward to work with.