Search Results

Search found 17627 results on 706 pages for 'hierarchical query'.

Page 378/706 | < Previous Page | Next Page >

  • Google Search Parameter Question

    - by Brian
    I've been trying to determine the different parameters used by Google in their search queries. In particular, the usg parameter is the one giving me trouble. Here is an example value for it, taken from an actual Google query: usg=0_zDqudnCN52ATGjAl3tignXNtBo4%3D Does anyone know what it's for, or recognize the format? I've done a bit of digging but haven't found any confirmation of what it could be. Here is the link that I took a look at: http://www.webmasterworld.com/google/3892573.htm

    Read the article

  • Should each page of a Blog listing have its own Title

    - by RandomBen
    Should example.com/Blog?Page=1, example.com/Blog?Page=2, etc. have the same title? I have done some research on this, and SEOMoz's tools say I have duplicate titles, and so does Google's Webmaster Tools. If you look at top-end examples like http://www.seobook.com/blog and http://www.seomoz.org/blog, they both use the same title across all of their ?Page=X URLs. So what is the better choice, or does it even matter?

    Read the article

  • T-SQL Equivalents for Microsoft Access VBA Functions

    If you need to migrate your Access application to SQL Server, don't count on the SQL Server Upsize Wizard in Microsoft Access to automatically convert your VBA functions. If you want to push the complex query processing done by your Access queries to the back end, you'll have to rewrite them in T-SQL.
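
    The article's full mappings aren't reproduced here, but as a hedged illustration of the kind of rewriting involved (the Orders table and its columns are hypothetical), common Access VBA expressions translate to T-SQL like this:

        -- VBA: IIf(Freight > 100, "High", "Low")
        SELECT CASE WHEN Freight > 100 THEN 'High' ELSE 'Low' END AS FreightBand
        FROM Orders;

        -- VBA: Nz(ShippedDate, #1900-01-01#)   (Nz replaces Null with a default)
        SELECT ISNULL(ShippedDate, '19000101') AS ShippedDate
        FROM Orders;

        -- VBA: Format(ShippedDate, "yyyy-mm")  (style 120 is yyyy-mm-dd hh:mi:ss)
        SELECT CONVERT(char(7), ShippedDate, 120) AS ShipMonth, COUNT(*) AS OrderCount
        FROM Orders
        GROUP BY CONVERT(char(7), ShippedDate, 120);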

    Read the article

  • Don’t Forget! In-Memory Databases are Hot

    - by andrewbrust
    If you’re left scratching your head over SAP’s intention to acquire Sybase for almost $6 billion, you’re not alone. Despite Sybase’s 1990s reign as the supreme database standard in certain sectors (including Wall Street), the company’s flagship product has certainly fallen from grace. Why would SAP pay a greater than 50% premium over Sybase’s closing price on the day of the announcement just to acquire a relational database which is firmly stuck in maintenance mode?

    Well, there’s more to Sybase than the relational database product. Take, for example, its mobile application platform. It hit Gartner’s “Leaders’ Quadrant” in January of last year, and SAP needs a good mobile play. Beyond the platform itself, Sybase has a slew of mobile services; click this link to look them over.

    There’s a second major asset that Sybase has though, and I wonder if it figured prominently into SAP’s bid: Sybase IQ. Sybase IQ is a columnar database. Columnar databases place values from a given database column contiguously, unlike conventional relational databases, which store all of a row’s data in close proximity. Storing column values together works well in aggregation reporting scenarios, because the figures to be aggregated can be scanned in one efficient step. It also makes for high rates of compression, because values from a single column tend to be close to each other in magnitude and may contain long sequences of repeating values. Highly compressible databases use much less disk storage and can be largely or wholly loaded into memory, resulting in lightning-fast query performance. For an ERP company like SAP, with its own legacy BI platform (SAP BW) and the entire range of Business Objects and Crystal Reports BI products (which it acquired in 2007), query performance is extremely important.

    And it’s a competitive necessity too. QlikTech has built an entire company on a columnar, in-memory BI product (QlikView). So too has startup company Vertica. IBM’s TM1 product has been doing in-memory OLAP for years. And guess who else has the in-memory religion? Microsoft does, in the form of its new PowerPivot product. I expect the technology in PowerPivot to become strategic to the full-blown SQL Server Analysis Services product and the entire Microsoft BI stack. I sure don’t blame SAP for jumping on the in-memory bandwagon, if indeed the Sybase acquisition is, at least in part, motivated by that.

    It will be interesting to watch and see what SAP does with Sybase’s product line-up (assuming the acquisition closes), including the core database, the mobile platform, IQ, and even tools like PowerBuilder. It is also fascinating to watch columnar’s encroachment on relational. Perhaps this acquisition will be columnar’s tipping point and people will no longer see it as a fad. Are you listening, Larry Ellison?

    Read the article

  • Tips for Managing Complex Queries

    There are ways to structure a query that will minimize the complexity while raising confidence in the data returned. This article shares a few techniques that will help simplify the writing of complex queries.
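
    The article itself isn't quoted here, but one common structuring technique (an illustration, not necessarily the article's; the Sales table and thresholds are hypothetical) is to break a complex query into named steps with common table expressions, so each stage can be validated on its own:

        -- Each CTE is a verifiable intermediate result; while debugging,
        -- SELECT from a single CTE to build confidence in the final output.
        WITH MonthlySales AS (
            SELECT CustomerID,
                   DATEADD(MONTH, DATEDIFF(MONTH, 0, SaleDate), 0) AS SaleMonth,
                   SUM(Amount) AS MonthTotal
            FROM dbo.Sales
            GROUP BY CustomerID, DATEADD(MONTH, DATEDIFF(MONTH, 0, SaleDate), 0)
        ),
        TopCustomers AS (
            SELECT CustomerID
            FROM MonthlySales
            GROUP BY CustomerID
            HAVING SUM(MonthTotal) > 10000
        )
        SELECT m.CustomerID, m.SaleMonth, m.MonthTotal
        FROM MonthlySales m
        JOIN TopCustomers t ON t.CustomerID = m.CustomerID
        ORDER BY m.CustomerID, m.SaleMonth;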

    Read the article

  • using Generics in C# [closed]

    - by Uphaar Goyal
    I have started looking into using generics in C#. As an example, I have an abstract class that implements generic methods. These generic methods take a SQL query, a connection string and the type T as parameters, then construct the DataSet, populate the object and return it. This way each business object does not need a method to populate it with data or construct its DataSet; all we need to do is pass the type, the SQL query and the connection string, and these methods do the rest. I am providing the code sample here. I am just looking to discuss with people who might have a better solution to what I have done.

        using System;
        using System.Collections.Generic;
        using System.Linq;
        using System.Text;
        using System.Data;
        using System.Data.SqlClient;
        using MWTWorkUnitMgmtLib.Business;
        using System.Collections.ObjectModel;
        using System.Reflection;

        namespace MWTWorkUnitMgmtLib.TableGateway
        {
            public abstract class TableGateway
            {
                public TableGateway() { }

                protected abstract string GetConnection();
                protected abstract string GetTableName();

                public DataSet GetDataSetFromSql(string connectionString, string sql)
                {
                    // The DataSet is returned to the caller, so it must not be disposed here.
                    DataSet ds = new DataSet();
                    using (SqlConnection connection = new SqlConnection(connectionString))
                    using (SqlCommand command = connection.CreateCommand())
                    {
                        command.CommandText = sql;
                        connection.Open();
                        using (SqlDataAdapter adapter = new SqlDataAdapter(command))
                        {
                            adapter.Fill(ds);
                        }
                    }
                    return ds;
                }

                public static bool ContainsColumnName(DataRow dr, string columnName)
                {
                    return dr.Table.Columns.Contains(columnName);
                }

                public DataTable GetDataTable(string connString, string sql)
                {
                    DataSet ds = GetDataSetFromSql(connString, sql);
                    DataTable dt = null;
                    if (ds != null && ds.Tables.Count > 0)
                    {
                        dt = ds.Tables[0];
                    }
                    return dt;
                }

                // Copies each DataRow column onto the matching public property of t.
                public T Construct<T>(DataRow dr, T t) where T : class, new()
                {
                    PropertyInfo[] properties = t.GetType().GetProperties();
                    foreach (PropertyInfo property in properties)
                    {
                        if (ContainsColumnName(dr, property.Name) && dr[property.Name] != DBNull.Value)
                            property.SetValue(t, dr[property.Name], null);
                    }
                    return t;
                }

                public T GetByID<T>(string connString, string sql, T t) where T : class, new()
                {
                    // Assumes the query returns at least one row.
                    DataTable dt = GetDataTable(connString, sql);
                    DataRow dr = dt.Rows[0];
                    return Construct(dr, t);
                }

                public List<T> GetAll<T>(string connString, string sql, T t) where T : class, new()
                {
                    // A fresh instance is created per row so the list does not
                    // contain N references to the same object.
                    List<T> collection = new List<T>();
                    DataTable dt = GetDataTable(connString, sql);
                    foreach (DataRow dr in dt.Rows)
                        collection.Add(Construct(dr, new T()));
                    return collection;
                }
            }
        }

    Read the article

  • 1 Million IOPS

    - by GrumpyOldDBA
    As a keen follower of storage performance, I couldn't help but be drawn to this article in The Register this morning: http://www.theregister.co.uk/2010/04/14/lsi_million_iops/ I gave my 5-year-old laptop a new lease of life with an SSD and, in combination with the old drive repurposed as an external drive, managed to reduce the time of a demo query from 50-odd minutes down to 6 minutes. I also have 4 Silicon Power 32GB SSDs set up as a RAID 0 on my home server, an overblown PC. http://www.futurestorage.co.uk/index.asp?selmanuf...(read more)

    Read the article

  • ODBC and Excel (2 replies)

    Hello, I am using the following connection to query an Excel spreadsheet:

        AConnectionString = "Driver={Microsoft Excel Driver (*.xls)};DriverId=790;Dbq=" & ofdSelectFile.FileName & ";DefaultDir=c:\;"
        ASourceConnection = New Odbc.OdbcConnection(AConnectionString)
        Dim ADataAdapter as new odbc.odbcDataAdapter("SELECT * FROM [Sheet1$]", ASourceConnection)
        ADataAdapter.Fill(MyDataset)

    This works great, howe...

    Read the article

  • Discoverer 11g (11.1.1.2) Certified with E-Business Suite

    - by Steven Chan
    Discoverer is an ad-hoc query, reporting, analysis, and Web-publishing tool that allows end-users to work directly with Oracle E-Business Suite OLTP data. Discoverer 11g (11.1.1.2) is now certified with Oracle E-Business Suite Release 11i and 12. The latest release of Oracle Business Intelligence Discoverer 11g offers new functionality, including integration with Oracle Business Intelligence Enterprise Edition (OBIEE), published Discoverer Web service APIs, integration with Oracle WebCenter, integration with Oracle WebLogic Server, integration with Enterprise Manager (Fusion Middleware Control), and improved performance and scalability.

    Read the article

  • SQL Server Scripts I Use

    - by Bill Graziano
    When I get to a new client I usually find myself using the same set of scripts for maintenance and troubleshooting. These are all drop-in solutions for various maintenance issues.

    Reindexing. I use Michelle Ufford's (SQLFool) re-indexing script. I like that it has a throttle and only re-indexes when needed. She also has a variety of other interesting scripts on her blog too.

    Server Activity. Adam Machanic is up to version 10 of sp_WhoIsActive. It's a great replacement for the sp_who* stored procedures and does so much more. If a server is acting funny this is one of the first tools I use.

    Backups. Tara Kizer has a great little T-SQL script for SQL Server backups.

    Wait Stats. Paul Randal has a great script to display wait stats. The biggest benefit for me is that his script filters out at least three dozen wait stats that I just don't care about (for example LAZYWRITER_SLEEP).

    Update Statistics. I didn't find anything I liked so I wrote a simple script to update stats myself. The big need for me was that it had to run inside a time window and update the oldest statistics first (a sketch of the idea appears below). Is there a better one?

    Diagnostic Queries. Glenn Berry has a huge collection of DMV queries available. He also just highlighted five of them, including two I really like dealing with unused indexes and suggested indexes.

    Single Use Query Plans. Kim Tripp has a script that counts the number of single-use query plans. This should guide you in whether to enable the Optimize for Adhoc Workloads option in SQL Server 2008.

    Granting Permissions to Developers. This is one of those scripts I didn't even know I needed until I needed it. Kendra Little wrote it to grant a login read-only permission to all the databases. It also grants view server state and a few other handy permissions.

    What else do you use? What should I add to my list?
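
    Bill's script isn't included in the post, so what follows is only a minimal sketch of the idea he describes (update the most-stale statistics first, stop when the time window expires); the 30-minute window and the RESAMPLE option are assumptions, not his code:

        -- Sketch: update the oldest statistics first, stop after a time window.
        DECLARE @stopAt datetime = DATEADD(MINUTE, 30, GETDATE());
        DECLARE @obj sysname, @stat sysname, @sql nvarchar(max);

        DECLARE stats_cur CURSOR LOCAL FAST_FORWARD FOR
            SELECT QUOTENAME(SCHEMA_NAME(o.schema_id)) + '.' + QUOTENAME(o.name),
                   QUOTENAME(s.name)
            FROM sys.stats s
            JOIN sys.objects o ON o.object_id = s.object_id
            WHERE o.is_ms_shipped = 0
            ORDER BY STATS_DATE(s.object_id, s.stats_id);  -- oldest first

        OPEN stats_cur;
        FETCH NEXT FROM stats_cur INTO @obj, @stat;
        WHILE @@FETCH_STATUS = 0 AND GETDATE() < @stopAt
        BEGIN
            SET @sql = N'UPDATE STATISTICS ' + @obj + N' ' + @stat + N' WITH RESAMPLE;';
            EXEC sp_executesql @sql;
            FETCH NEXT FROM stats_cur INTO @obj, @stat;
        END
        CLOSE stats_cur;
        DEALLOCATE stats_cur;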

    Read the article

  • Creating A SharePoint Parent/Child List Relationship – SharePoint 2010 Edition

    - by Mark Rackley
    Hey blog readers… It has been almost 2 years since I posted my most-read blog on creating a Parent/Child list relationship in SharePoint 2007: Creating a SharePoint List Parent / Child Relationship - Out of the Box And then a year ago I improved on my method and redid the blog post… still for SharePoint 2007: Creating a SharePoint List Parent/Child Relationship – VIDEO REMIX Since then many of you have been asking me how to get this to work in SharePoint 2010, and frankly I have just not had time to look into it. I wish I could have jumped into this sooner, but have just recently begun to look at it. Well.. after all this time I have actually come up with two solutions that work. Neither of them is as clean as I'd like it to be, but I wanted to get something in your hands that you can start using today. Hopefully in the coming weeks and months I'll be able to improve upon this further and give you guys some better options. For the most part, the process is identical to the 2007 process, but you have probably found out that the list view web parts in 2010 behave differently, and getting the Parent ID to your new child form can be a pain in the rear (at least that's what I've discovered). Anyway, like I said, I have found a couple of solutions that work. If you know of a better one, please let us know, as it bugs me that this is not as eloquent as my 2007 implementation.

    Getting on the same page

    First thing I'd recommend is recreating this blog: Creating a SharePoint List Parent/Child Relationship – VIDEO REMIX in SharePoint 2010… There are some vague differences, but it's basically the same… Here's a quick video of me doing this in SP 2010: Creating Lists necessary for this blog post Now that you have the lists created, let's set up the New Time form to use a QueryString variable to populate the Parent ID field: Creating parameters in Child's new item form to set parent ID Did I talk fast enough through both of those videos? Hopefully by now that stuff is old hat to you, but I wanted to make sure everyone could get on the same page. Okay… let's get started.

    Solution 1 – XSLTListView with Javascript

    This solution is the more elegant of the two, however it does require the use of a little javascript. The other solution does not use javascript, but it also doesn't use the pretty new SP 2010 pop-ups. I'll let you decide which you like better. The basic steps of this solution are:

      Insert a Related Item View
      Insert a ContentEditorWebPart
      Insert script in the ContentEditorWebPart that pulls the ID from the query string and calls the method to insert a new item on the child entry form
      Hide the toolbar on the data view to remove the "add new item" link

    Again, you don't HAVE to use a CEWP, you could just put the javascript directly in the page using SPD.
    Anyway, here is how I did it: Using Related Item View / JavaScript. Here's the JavaScript I used in my Content Editor Web Part:

        <script type="text/javascript">
        function NewTime() {
            // Get the Query String values and split them out into the vals array
            var vals = new Object();
            var qs = location.search.substring(1, location.search.length);
            var args = qs.split("&");
            for (var i = 0; i < args.length; i++) {
                var nameVal = args[i].split("=");
                var temp = unescape(nameVal[1]).split('+');
                nameVal[1] = temp.join(' ');
                vals[nameVal[0]] = nameVal[1];
            }
            var issueID = vals["ID"];
            // use this to bring up the pretty pop up
            NewItem2(event, "http://sp2010dev:1234/Lists/Time/NewForm.aspx?IssueID=" + issueID);
            // use this to open a new window
            // window.location = "http://sp2010dev:1234/Lists/Time/NewForm.aspx?IssueID=" + issueID;
        }
        </script>

    Solution 2 – DataFormWebPart and exact same 2007 Process

    This solution is a little more of a hack, but it is also MUCH closer to the process we used in SP 2007. So, if you don't mind losing the pretty pop-up and prefer the comforts of what you are used to, you can give this one a try. The basic steps are:

      Insert a DataFormWebPart instead of the List Data View
      Create a Parameter on the DataFormWebPart to store the "ID" Query String Variable
      Filter the DataFormWebPart using the Parameter
      Insert a link at the bottom of the DataFormWebPart that points to the Child's new item form and passes in the Parent ID using the Parameter

    See.. like I told you, exact same process as in 2007 (except using the DataFormWebPart). The DataFormWebPart also requires a lot more work to make it look "pretty", but it's just table rows and cells, and can be configured pretty painlessly. Here is that video: Using DataForm Web Part

    One quick update… if you change the link in this solution from:

        <tr>
          <td><a href="http://sp2010dev:1234/Lists/Time/NewForm.aspx?IssueID={$IssueIDParam}">Click here to create new item...</a></td>
        </tr>

    to:

        <tr>
          <td><a href="javascript:NewItem2(event,'http://sp2010dev:1234/Lists/Time/NewForm.aspx?IssueID={$IssueIDParam}');">Click here to create new item...</a></td>
        </tr>

    it will open up in the pretty pop-up and act the same as solution one… So… both solutions will now behave the same to the end user. It just depends on which you want to implement.

    That's all for now… Remember, in both solutions, when you have them working you can make the "IssueID" invisible to users by using the "ms-hidden" class (it's my previous blog post on the subject up there). That's basically all there is to it! No pithy or witty closing this time… I am sorry it took me so long to dive into this and I hope your questions are answered. As I become more polished myself I will try to come up with a cleaner solution that will make everyone happy… As always, thanks for taking the time to stop by.

    Read the article

  • Website Health Check - Keyword Blunders - Part 2

    Website Health Check is becoming an integral part of Search Engine Optimization (SEO), because it helps find mistakes in websites that commonly go unnoticed. Ultimately, those mistakes can mean the difference between coming up as the first or the last result in a search engine query. A small part regarding keywords, such as keyword density, keyword stuffing, spelling errors, cloaking, etc., is explained here.

    Read the article

  • Silverlight Cream for March 06, 2010 -- #808

    - by Dave Campbell
    In this Issue: András Velvárt, felix corke, Colin Eberhardt, Christopher Bennage, Gergely Orosz, Entity Spaces Team Blog, Mike Taulty (-2-), Jit Ghosh, and Jesse Liberty.

    Shoutouts: Jeremy Likness expands on the Silverlight Team's post Vancouver Olympics - How'd We Do That? Gavin Wignall has a post up Creating a 360 photograph of an object with Silverlight Photosynth

    From SilverlightCream.com:

    Transforming an Ugly Duckling into a Graceful Swan With Expression Blend and Silverlight - Part 2 Intro Animation: András Velvárt has part 2 of his Transformation series up at SilverlightShow... he's taking the intro animation to a new length, allowing playback even... cool video tutorial!

    Free Silverlight 4 beta skin!: felix corke has a Silverlight 4 theme up for us all to use. If you like a dark theme like Blend, you'll like this... I like it!

    Linq to Visual Tree: Colin Eberhardt has a great tutorial up for using LINQ to query the WPF or Silverlight Visual Tree while retaining the tree structure. He also has links out to other techniques.

    XAML Attributes on Separate Lines: Christopher Bennage has a post up showing how to easily get all your XAML attributes on separate lines using a VS menu option... I didn't know that!

    Using built-in, embedded and streamed fonts in Silverlight: Gergely Orosz has a post up at ScottLogic going over fonts in Silverlight -- built-in, embedded, or streamed, with examples and code.

    EntitySpaces 2010 Two Part Series on Silverlight and WCF: The Entity Spaces Team Blog has a pair of videos up on Entity Spaces 2010, WCF, and Silverlight. Part 1 is the intro and explanation; part 2 is a full-up app demonstrating it.

    MEF, Silverlight and the DeploymentCatalog: In an attempt to respond fully to a query, Mike Taulty literally pushed the record button and took off on what became a tutorial video on building a real Silverlight app utilizing MEF.

    Silverlight 4, Experiment with Pluggable Navigation and a WCF Data Service: Mike Taulty has an experiment detailed on his blog about pluggable navigation and Silverlight 4. He walks through the history of how we got to this point, then dives into an example... good external links too.

    Enhancing Silverlight Video Experiences with Contextual Data: This is a post on the MSDN Magazine site where Jit Ghosh has a great long post about not only Smooth Streaming with Silverlight, but also adding contextual data to your video.

    When Is It OK To Hack?: Read what all Jesse Liberty gets involved in when he's trying to get something out the door and has to work around a problem. Just about as interesting are the comments... check it out and leave your own!

    Stay in the 'Light! Twitter SilverlightNews | Twitter WynApse | WynApse.com | Tagged Posts | SilverlightCream Join me @ SilverlightCream | Phoenix Silverlight User Group Technorati Tags: Silverlight Silverlight 3 Silverlight 4 MIX10

    Read the article

  • A pseudo-listener for AlwaysOn Availability Groups for SQL Server virtual machines running in Azure

    - by MikeD
    I am involved in a project that is implementing SharePoint 2013 on virtual machines hosted in Azure. The back-end data tier consists of two Azure VMs running SQL Server 2012, with the SharePoint databases contained in an AlwaysOn Availability Group. I used this "Tutorial: AlwaysOn Availability Groups in Windows Azure (GUI)" to help me implement this setup.

    Because Azure DHCP will not assign multiple unique IP addresses to the same VM, having an AG Listener in Azure is not currently supported. I wanted to figure out another mechanism to support a "pseudo listener" of some sort. First, I created a CNAME (alias) record in the DNS zone with a short TTL (time to live) of 5 minutes (I may yet make this even shorter). The record represents a logical name (let's say the alias is SPSQL) of the server to connect to for the databases in the availability group (AG). When Server1 was hosting the primary replica of the AG, I would set the CNAME of SPSQL to be SERVER1. When the AG failed over to Server2, I wanted to set the CNAME to SERVER2. Seemed simple enough.

    (It's important to point out that the connection strings for my SharePoint services should use the CNAME alias, and not the actual server name. This whole thing falls apart otherwise.)

    To accomplish this, I created identical SQL Agent Jobs on Server1 and Server2, with two steps:

    Step 1: Determine if this server is hosting the primary replica. This is a T-SQL step using this script:

        declare @agName sysname = 'AGTest'
        set nocount on
        declare @primaryReplica sysname

        select @primaryReplica = agState.primary_replica
        from sys.dm_hadr_availability_group_states agState
            join sys.availability_groups ag on agState.group_id = ag.group_id
        where ag.name = @agName

        if not exists(
            select *
            from sys.dm_hadr_availability_group_states agState
                join sys.availability_groups ag on agState.group_id = ag.group_id
            where @@Servername = agState.primary_replica
                and ag.name = @agName)
        begin
            raiserror ('Primary replica of %s is not hosted on %s, it is hosted on %s',17,1,@agName, @@Servername, @primaryReplica)
        end

    This script determines if the primary replica value of the AG is the same as the server name, which means that our server is hosting the current primary replica (you should update the value of the @agName variable to the name of your AG). If this is true, I want the DNS alias to point to this server. If the current server is not hosting the primary replica, then the script raises an error. Also, if the script can't be executed because it cannot connect to the server, that also will generate an error. For the job step settings, I set the On Failure option to "Quit the job reporting success". The next step in the job will set the DNS alias to this server name, and I only want to do that if I know that it is the current primary replica; otherwise I don't want to do anything. I also include the step output in the job history so I can see the error message.

    Step 2: Update the CNAME entry in DNS with this server's name. I used a PowerShell script to accomplish this:

        $cname = "SPSQL.contoso.com"
        $query = "Select * from MicrosoftDNS_CNAMEType"
        $dns1 = "dc01.contoso.com"
        $dns2 = "dc02.contoso.com"
        if ((Test-Connection -ComputerName $dns1 -Count 1 -Quiet) -eq $true)
        {
            $dnsServer = $dns1
        }
        elseif ((Test-Connection -ComputerName $dns2 -Count 1 -Quiet) -eq $true)
        {
            $dnsServer = $dns2
        }
        else
        {
            $msg = "Unable to connect to DNS servers: " + $dns1 + ", " + $dns2
            Throw $msg
        }
        $record = Get-WmiObject -Namespace "root\microsoftdns" -Query $query -ComputerName $dnsServer | ? { $_.Ownername -match $cname }
        $thisServer = [System.Net.Dns]::GetHostEntry("LocalHost").HostName + "."
        $currentServer = $record.RecordData
        if ($currentServer -eq $thisServer)
        {
            $cname + " CNAME is up to date: " + $currentServer
        }
        else
        {
            $cname + " CNAME is being updated to " + $thisServer + ". It was " + $currentServer
            $record.RecordData = $thisServer
            $record.put()
        }

    This script does a few things:

      finds a responsive domain controller (Test-Connection does a ping and returns a Boolean value if you specify the -Quiet parameter)
      makes a WMI call to the domain controller to get the current CNAME record value (Get-WmiObject)
      gets the FQDN of this server (GetHostEntry)
      checks if the CNAME record is correct and updates it if necessary

    (You should update the values of the variables $cname, $dns1 and $dns2 for your environment.)

    Since my domain controllers are also hosted in Azure VMs, either one of them could be down at any point in time, so I need to find a DC that is responsive before attempting the DNS call. The other little thing here is that the CNAME record contains the FQDN of a machine, plus it ends with a period, so the comparison of the CNAME record has to take the trailing period into account.

    When I tested this step, I was getting ACCESS DENIED responses from PowerShell for the Get-WmiObject cmdlet that does a remote lookup on the DC. This occurred because the SQL Agent service account was not a member of the Domain Admins group, so I decided to create a SQL Credential to store the credentials for a domain administrator account and use it as a PowerShell proxy (rather than give the service account Domain Admins membership).

    In SQL Management Studio, right click on the Credentials node (under the server's Security node), and choose New Credential... Then, under SQL Agent --> Proxies, right click on the PowerShell node and choose New Proxy... Finally, in the job step properties for the PowerShell step, select the new proxy in the Run As drop down.

    I created this two-step job on both nodes of the Availability Group, but if you have more than two nodes, just create the same job on all the servers. I set the schedule for the job to execute every minute. When the server that is hosting the primary replica is running the job, the job history shows both steps succeeding; the job history on the secondary server shows the raised error from step 1.

    When a failover occurs, the SQL Agent job on the new primary replica will detect that the CNAME needs to be updated within a minute. Based on the TTL of the CNAME (which I said at the beginning was 5 minutes), the SharePoint servers will get the new alias within five minutes and should be able to reconnect. I may want to shorten up the TTL to reduce the time it takes for the client connections to use the new alias. Using a DNS CNAME and a SQL Agent Job on all servers hosting AG replicas, I was able to create a pseudo-listener to automatically change the name of the server that was hosting the primary replica, for a scenario where I cannot use a regular AG listener (in this case, because the servers are all hosted in Azure).

    Read the article

  • How to internally rewrite a page when requested from specific HTTP_HOST

    - by Andy
    Hi all, I have a Drupal site, site.com, and our client has a campaign that they're promoting for which they've bought a new domain name, campaign.com. I'd like it so that a request for campaign.com internally rewrites to a particular page of the Drupal site. Note Drupal uses an .htaccess file in the document root. The normal Drupal rewrite is

        # Rewrite URLs of the form 'x' to the form 'index.php?q=x'.
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteCond %{REQUEST_URI} !=/favicon.ico
        RewriteRule ^(.*)$ index.php?q=$1 [L,QSA]

    I added the following before the normal rewrite.

        # Custom URLS (eg. microsites) go here
        RewriteCond %{HTTP_HOST} =campaign.com
        RewriteCond %{REQUEST_URI} =/
        RewriteRule ^ index.php?q=node/22 [L]

    Unfortunately it doesn't work, it just shows the homepage. Turning on the rewrite log I get this.

        1.  [rid#2da8ea8/initial] (3) [perdir D:/wamp/www/] strip per-dir prefix: D:/wamp/www/ ->
        2.  [rid#2da8ea8/initial] (3) [perdir D:/wamp/www/] applying pattern '^' to uri ''
        3.  [rid#2da8ea8/initial] (2) [perdir D:/wamp/www/] rewrite '' -> 'index.php?q=node/22'
        4.  [rid#2da8ea8/initial] (3) split uri=index.php?q=node/22 -> uri=index.php, args=q=node/22
        5.  [rid#2da8ea8/initial] (3) [perdir D:/wamp/www/] add per-dir prefix: index.php -> D:/wamp/www/index.php
        6.  [rid#2da8ea8/initial] (2) [perdir D:/wamp/www/] strip document_root prefix: D:/wamp/www/index.php -> /index.php
        7.  [rid#2da8ea8/initial] (1) [perdir D:/wamp/www/] internal redirect with /index.php [INTERNAL REDIRECT]
        8.  [rid#2da7770/initial/redir#1] (3) [perdir D:/wamp/www/] strip per-dir prefix: D:/wamp/www/index.php -> index.php
        9.  [rid#2da7770/initial/redir#1] (3) [perdir D:/wamp/www/] applying pattern '^' to uri 'index.php'
        10. [rid#2da7770/initial/redir#1] (3) [perdir D:/wamp/www/] strip per-dir prefix: D:/wamp/www/index.php -> index.php
        11. [rid#2da7770/initial/redir#1] (3) [perdir D:/wamp/www/] applying pattern '^(.*)$' to uri 'index.php'
        12. [rid#2da7770/initial/redir#1] (1) [perdir D:/wamp/www/] pass through D:/wamp/www/index.php

    I'm not used to mod_rewrite, so I might be missing something, but comparing the logs from a call to http://site.com/node/3 and from http://campaign.com/ I can't see any meaningful difference. Specifically, uri and args on line 4 seem correct, the internal redirect on line 7 seems right, and the pass-through on line 12 seems right (because the file index.php exists). But for some reason it seems the query string has been discarded/ignored around the time of the internal redirect. I'm completely stumped. Also, if anyone could provide a reference on understanding the rewrite log, that might help. It'd be great if there's a way to track the query string through the internal redirect. FWIW I'm using WampServer 2.1 with Apache 2.2.17.

    Read the article

  • SQL Windowing screencast session for Cuppa Corner - rolling totals, data cleansing

    - by tonyrogerson
    In this 10-minute screencast I go through the basics of what I term windowing, which is basically the technique of filtering to a set of rows given a specific value, for instance a sub-query that aggregates, or a join that returns more than just one row (for instance on a one-to-many relationship). http://sqlserverfaq.com/content/SQL-Basic-Windowing-using-Joins.aspx SQL below...

        USE tempdb
        go
        CREATE TABLE RollingTotals_Nesting (
            client_id int not null,
            transaction_date date not null,
            transaction_amount...(read more)
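
    The snippet above is cut off at the table definition, but as a hedged sketch of the join-based windowing the session describes (column names taken from the truncated CREATE TABLE; the amount column's type is assumed), a per-client rolling total looks like this:

        -- Self-join windowing: each row joins to all rows for the same client up
        -- to its date, and the aggregate over that window is the running total.
        SELECT t1.client_id,
               t1.transaction_date,
               SUM(t2.transaction_amount) AS running_total
        FROM RollingTotals_Nesting t1
        JOIN RollingTotals_Nesting t2
          ON t2.client_id = t1.client_id
         AND t2.transaction_date <= t1.transaction_date
        GROUP BY t1.client_id, t1.transaction_date
        ORDER BY t1.client_id, t1.transaction_date;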

    Read the article

  • TSQL Challenge 28 - SELECT TOP N articles from each category from a SQL Server 2000 database

    The challenge is to write a query that returns the articles to be displayed on the home page of the website. N articles from each category are to be selected, where N is configured in the Categories table, and each category should show its most recent N articles. ArticleID can be used to identify the most recent articles: an article with a higher number is a more recent article.
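
    The challenge's exact schema isn't shown here, but a classic SQL Server 2000-compatible sketch (no ROW_NUMBER in that version; the Articles and Categories columns are assumed) counts how many articles in the same category are at least as recent, and keeps a row only when that count is within the category's configured N:

        -- Top-N per category without window functions: a row survives if fewer
        -- than N articles in its category have an ArticleID at or above its own.
        SELECT a.CategoryID, a.ArticleID
        FROM Articles a
        JOIN Categories c ON c.CategoryID = a.CategoryID
        WHERE (SELECT COUNT(*)
               FROM Articles a2
               WHERE a2.CategoryID = a.CategoryID
                 AND a2.ArticleID >= a.ArticleID) <= c.N
        ORDER BY a.CategoryID, a.ArticleID DESC;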

    Read the article

  • Sending Parameters with the BizTalk HTTP Adapter

    - by Christopher House
    I've never had occasion to use the BizTalk HTTP adapter since I've always needed SOAP rather than just POX (plain old XML).  Yesterday we decided that we're going to expose some data via a Java servlet that will accept an HTTP post and respond with POX.  I knew BizTalk had an HTTP adapter but I had no idea what its capabilities were. After a quick read through the BizTalk docs, it was apparent that the HTTP send adapter does in fact do posts.  The concern I had though was how we were going to supply parameters to the servlet.  The examples I had seen using the HTTP adapter all involved posting an XML message to some HTTP location.  Our Java guy, however, didn't want to take that approach.  He wanted us to provide a query string via post, much like you'd expect to see on an HTTP get.  I decided to put together a little test scenario and see what I could come up with.  We didn't have a test servlet I could go against and my Java experience is virtually nil, so I decided to put together an ASP.Net project to act as the servlet.  It didn't need to be fancy, just one HttpHandler that accepts a post, reads a parameter and returns XML.  With the HttpHandler done, I put together a simple orchestration to send a message to the handler.  I started by having the orch send a message of type System.String to see what it would look like when the handler received it.  I set a breakpoint in my handler and kicked off the orchestration.  Below is what I saw: As I suspected, because of BizTalk's XML serialization, System.String was not going to work.  I thought back to my BizTalk 2004 days and a project I worked on that required sending HTML formatted emails via the SMTP adapter.  To accomplish that, I had used a .Net class with a custom serialization formatter that I got from a Microsoft sample.  The code for the class, RawString, can be found here. I created a new class library with the RawString class as well as a static factory class, referenced that in my orchestration project and changed my message type from System.String to RawString.  Below is what the code in my message construction looks like: After deploying the updated orchestration, I fired it off again and checked the breakpoint in my HttpHandler.  This is what I saw: And there you have it.  The RawString message type allowed me to pass a query string in the HTTP post without wrapping it in XML.

    Read the article

  • Some date math fun

    - by drsql
    Note: The concept presented here is pretty simple and I am not claiming that I am the first to do this… so if you were the one who came up with it, let me know and I will give you linkage. Today I was needing to get the data for the current month for a query, and it hit me that I really didn't have a good way to do this.  There were two common methods that people use: WHERE YEAR(MonthColumn) = YEAR(Getdate()) AND MONTH(MonthColumn) = MONTH(Getdate()) But this particular method is pretty horrible...(read more)
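
    The post is truncated before the punch line, but the usual fix this kind of post builds toward (an assumption, not necessarily the author's exact code) is to compute the month boundaries once, so the predicate stays sargable and can use an index on the date column:

        -- Current-month filter without wrapping the column in functions.
        -- SomeTable and MonthColumn are hypothetical names.
        DECLARE @monthStart datetime, @nextMonthStart datetime;
        SET @monthStart = DATEADD(MONTH, DATEDIFF(MONTH, 0, GETDATE()), 0);
        SET @nextMonthStart = DATEADD(MONTH, 1, @monthStart);

        SELECT *
        FROM SomeTable
        WHERE MonthColumn >= @monthStart
          AND MonthColumn < @nextMonthStart;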

    Read the article

  • Discoverer 11.1.1.4 Certified with E-Business Suite

    - by Steven Chan
    Oracle Business Intelligence Discoverer is an ad-hoc query, reporting, analysis, and Web-publishing tool that allows end-users to work directly with Oracle E-Business Suite OLTP data. Discoverer 11g (11.1.1.4) is now certified with Oracle E-Business Suite Releases 11i and 12. Discoverer 11.1.1.4 is part of Oracle Fusion Middleware 11g Release 1 Version 11.1.1.4.0, also known as FMW 11g Patchset 3. Certified E-Business Suite releases are:

      EBS Release 11i 11.5.10.2 + ATG RUP 7 and higher
      EBS Release 12.0.6 and higher
      EBS Release 12.1.1 and higher

    Read the article

  • NoSQL is not about object databases

    - by Bertrand Le Roy
    NoSQL as a movement is an interesting beast. I kinda like that it's negatively defined (I happen to belong myself to at least one other such a-community). It's not in its roots about proposing one specific new silver bullet to kill an old problem. It's about challenging the consensus. Actually, blindly and systematically replacing relational databases with object databases would just replace one set of issues with another. No, the point is to recognize that relational databases are not a universal answer (although they have been used as one for so long) and recognize instead that there's a whole spectrum of data storage solutions out there. Why is it so hard to recognize, by the way? You are already using some of those other data storage solutions every day. Let me cite a few:

      The file system
      Active Directory
      XML / JSON documents
      The Web
      E-mail
      Logs
      Excel files
      EXIF blobs in your photos
      Relational databases
      And yes, object databases

    It's just a fact of modern life. Notice by the way that most of the data that you use every day is unstructured and thus mostly unsuitable for relational storage. It really is more a matter of recognizing it: you are already doing NoSQL. So what happens when for any reason you need to simultaneously query two or more of these heterogeneous data stores? Well, you build an index of sorts combining them, and that's what you query instead. Of course, there's not much distance to travel from that to realizing that querying is better done when completely separated from storage.

    So why am I writing about this today? Well, that's something I've been giving lots of thought, on and off, over the last ten years. When I built my first CMS all that time ago, one of the main problems my customers were facing was to manage and make sense of the mountain of unstructured data that was constituting most of their business. The central entity of that system was the file system, because we were dealing with lots of Word documents, PDFs, OCR'd articles, photos and static web pages. We could have stored all that in SQL Server. It would have worked. Ew. I'm so glad we didn't.

    Today, I'm working on Orchard (another CMS ;). It's a pretty young project, but already one of the questions we get the most is how to integrate existing data. One of the ideas I'll be trying hard to sell to the rest of the team in the next few months is to completely split the querying from the storage. Not only does this provide great opportunities for performance optimizations, it gives you homogeneous access to heterogeneous and existing data sources. For free.

    Read the article

  • Portal And Content - Content Integration - Best Practices

    - by Stefan Krantz
    Lately we have seen an increase in projects that have failed to deliver either user-friendly content integration or satisfactory performance. Our intention is to mitigate any knowledge gap that our previous post might have left you with, so this post will repeat some recommendations and reference back to older useful posts. Moreover, this post will help you understand, from the ground up, how to design, architect and implement business-enabled, responsive and performing portals with complex requirements for business-centric information publishing.

    Design the Information Model

    The key to successful portal deployments is information modeling. It is a key task to understand the use case you are designing for, so here is a set of questions you need to ask yourself or your customer:

      Question: Who will own the content, IT or Business? Answer: Business
      Question: Who will publish the content, IT or Business? Answer: Business
      Question: Will there be multiple publishers? Answer: Yes
      Question: Are the publishers computer scientists? Answer: No
      Question: How often does the information change: daily, weekly, monthly? Answer: Daily, weekly

    If your answers match at least two of these, we strongly recommend you design your content with the following principles:

      Divide your pages into logical sections, where each section is marked with its purpose
      Assign capabilities to each section: does it contain text, images, formatting, and/or is it static and populated through other contextual information
      Select the editor/design element type for each section:
        WYSIWYG - rich text
        Plain Text - non-formatted text
        Image - image object
        Static List - static list of formatted information
        Dynamic Data List - assembled information from multiple data files through a CMIS query

    The result of such a design map could look like the examples below. Based on the outcome of the required elements in the design (column 3 from the left), you now simply design a data model in WebCenter Content - Site Studio by creating a Region Definition structure matching your design requirements. For more information on how to create a Region Definition, see the following post: Region Definition Post (see instruction 7 for details).

    Each Region Definition can now be used to instantiate data files; a data file holds the actual data for each element in the Region Definition. Another way to see this is to compare the Region Definition to an extension of the metadata model in WebCenter Content for each data file item.

    Design content templates

    With a solid, dependable information model we can now proceed to template creation and page design. This phase focuses on how to place the content sections from the Region Definition on the page via a Content Presenter template. Remember: by creating Content Presenter templates you leverage the latest and most integrated technology WebCenter has to offer. This phase is much easier since you already have the information model and design wire-frames to base the logic on; however, there are still a few considerations to pay attention to:

      Base the template on ADF and make only necessary exceptions to markup when required
      Leverage ADF design components for tabs, accordions and other similar components; this way the design of the content-published areas will comply with other design areas based on custom ADF taskflows
      There is no performance impact when using metadata-based or Region Definition-based data
      All data access, regardless of type (metadata or XML data), can be accessed via the Content Presenter node

    See below for applied examples of how to access data:

      Access a metadata property from a Document: #{node.propertyMap['myProp'].value} (myProp can be, for instance, dDocName, dDocTitle, xComments or any other available metadata)
      Access element data from the data file XML: #{node.propertyMap['[Region Definition Name]:[Element name]'].asTextHtml} (Region Definition Name is the region definition that the current data file instantiates; Element name is the element value you want to grab from the data file)

    I recommend you read the following useful posts on the content template topic: CMIS queries and template creation (see instruction 9 for details); Static List template rendering. For more information on templates: Single Item Content Template, Multi Item Content Template, Expression Language.

    Internationalization Considerations

    When integrating content assets via the Content Presenter, you by now probably understand that the content item/data file is wired to the page. What is also pretty common at this stage is that the content item/data file only supports one language, since it is not practical or business-friendly to mix languages in a complex structure. You will therefore be left with a very common dilemma: building a complete new portal for each locale, which is not a good option! However, with a little information modeling and a clear naming convention this can be addressed. Basically, you simply make sure that all content items/data files are named with a predictable naming convention, like "Content1_EN" for the English rendition and "Content1_ES" for the Spanish rendition. This way, through simple, non-complex customizations, you will be able to dynamically switch the actual content item/data file just before rendering. By following the proposed approach above, you not only enable a simple mechanism for internationalized content, you also preserve the Content Presenter's support for business-accessible run-time publishing of information on existing and new pages. I recommend you read the following useful post on internationalization topics: Internationalize with Content Presenter

    Integrate with Review & Approval processes

    Today, review and approval functionality and configuration is based on WebCenter Content Criteria Workflows. Criteria Workflows use the metadata of the checked-in document to evaluate whether the document is under any review/approval process. So, for instance, if a Criteria Workflow is configured to trigger on any document with Version "2" or higher and Content Type "Instructions", any matching content item version will, on check-in, enter the workflow before being released for general access. A few things to consider when configuring Criteria Workflows:

      Make sure not to trigger on version one for content items that are data files; if you trigger on version 1 you will not only approve an empty document, you will also have a Content Presenter pointing to a non-existing document, since the document will only be available after successful completion of the workflow
      Approval workflows sometimes require more complex criteria; if that is the case, the recommendation is that the metadata triggering such criteria is automatically populated, which can be achieved through many approaches, including Content Profiles
      Criteria Workflows are configured and managed in the WebCenter Content Administration Applets, where you can configure one or more workflows

    Once you have configured Criteria Workflows, the Content Presenter supports the editors with the approval process directly inline in the "Contribution mode" of the portal. In addition to approve/reject and details of the task, the Content Presenter natively supports viewing the current and future versions of the change being approved. See below for an example.

    Architectural recommendations

      To support review & approval processes, minimize the number of data files per page
      Each CMIS query can consume significant time depending on its complexity, so minimize the number of CMIS queries per page
      Use Content Presenter templates based on ADF; this way you minimize the design considerations and optimize the usage of caching
      Implement the page with as few data files as possible; this simplifies the publishing process, increases performance and simplifies the release process
      A named data file (node), or list of named nodes, integrated into a page performs better than querying for data
      A named data file (node), or list of named nodes, integrated into a page enables business-centric page creation and publishing and reduces the need for IT department interaction

    Summary

    Just because one architectural decision solves a business problem doesn't mean it's the right one; when designing portals, all architecture has to be in harmony, with no piece impacting another. For instance, the most technically complex solution is not always the best, since it will most likely defeat business accessibility, performance, or both. The best approach is to first design for a simplicity that even a non-technical user can operate, then consider the performance impact, and finally look at the technology challenges these bring: work around them first with out-of-the-box features, and after that design and develop functions to complement the shortcomings.

    Read the article

  • Sesame update du jour: SL 4, OOB, Azure, and proxy support

    - by Fabrice Marguerie
    I've just published a new version of Sesame Data Browser. Here's what's new this time:

      Upgraded to Silverlight 4
      Can run out-of-browser (OOB), with elevated permissions. This gives you an icon on your desktop and enables new scenarios. Note: the application is unsigned for the moment.
      Support for Windows Azure authentication
      Support for SQL Azure authentication
      If you are behind a proxy that requires authentication, just give Sesame a new try after clicking on "If you are behind a proxy that requires authentication, please click here"
      An icon and a button for closing connections are now displayed on connection tabs
      Some less visible improvements

    Here is the connection view with anonymous access. If you want to access Windows Azure tables as OData, all you have to do is use your table storage endpoint as the URL, and provide your access key. A Windows Azure table storage address looks like this: http://<your account>.table.core.windows.net/

    If you want to browse your SQL Azure databases with Sesame, you have to enable OData support for them at https://www.sqlazurelabs.com/ConfigOData.aspx. I won't show how it works because it's already been done in several places over the Web. Here are pointers:

      OData.org: Got SQL Azure? Then you've got OData
      OakLeaf Systems: Enabling and Using the OData Protocol with SQL Azure
      Patrick Verbruggen: Creating an OData feed for your Azure databases
      Shawn Wildermuth: SQL Azure's OData Support
      Jack Greenfield: How to Use OData for SQL Azure with AppFabric Access Control

    You can choose to enable anonymous access or not. When you don't enable anonymous access, you have to provide an Issuer name and a Secret key, and optionally a Security Token Service (STS) endpoint.

    Excerpt from Jack Greenfield's blog: To enable OData access to the currently selected database, check the box labeled "Enable OData". When OData access is enabled, database user mapping information is displayed at the bottom of the form. Use the drop-down list labeled "Anonymous Access User" to select an anonymous access user. If an anonymous access user is selected, then all queries against the database presented without credentials will execute by impersonating that user. You can access the database as the anonymous user by clicking on the link provided at the bottom of the page. If no anonymous access user is selected, then the OData Service will not allow anonymous access to the database. Click the link labeled "Add User" to add a user for authenticated access. In the pop-up panel, select the user from the drop-down list. Leave the issuer name empty for simple authentication, or provide the name of a trusted Security Token Service (STS) for federated authentication. For example, to federate with another ACS-based STS, provide the base URI for the STS endpoint displayed by the Windows Azure AppFabric Portal for the STS. Click the "OK" button to complete the configuration process and dismiss the pop-up panel. When one or more authenticated access users are added, the OData Service will impersonate them when appropriate credentials are presented. You can designate as many authenticated access users as you like. The OData Service will decide which one to impersonate for each query by inspecting the credentials presented with the query.

    Next time I'll give an overview of how Sesame Data Browser is built. In the meantime, happy data browsing!

    Read the article
