Search Results

Search found 31902 results on 1277 pages for 'sql backup'.

Page 630/1277

  • Is it possible to rsync your web site to another backup server and use the same .htaccess files?

    - by stephenmm
    I am trying to use rsync to replicate all the files from one web server to another server that could act as a backup if the first one went down. The problem I am having is that the .htaccess file requires the AuthUserFile to have the fully qualified path to the .htpasswd file, and I cannot make the paths the same on the two machines. Does anyone know how I might use the same .htaccess file on two different servers? Thanks for any help that can be provided.

    Read the article

  • Can a FreePBX backup be restored to a different version?

    - by Tim Long
    I run a small PBX based on the FreePBX distro of Asterisk. The installation has been steadily upgraded but for various reasons, we want to start again on a new server with a clean install from the distribution media. Will I be able to take a backup from the old server and restore it to the new server, even though the installs are different versions? How sensitive are FreePBX backups to the build version? Is it possible to get at least a partial restore?

    Read the article

  • Unaccounted for database size

    - by Nazadus
    I currently have a database that is 20GB in size. I've run a few scripts which show each table's size (and other incredibly useful information such as index details), and the biggest table is 1.1 million records, which takes up 150MB of data. We have less than 50 tables, most of which take up less than 1MB of data. After looking at the size of each table I don't understand why the database shouldn't be 1GB in size after a shrink. The amount of available free space that SQL Server (2005) reports is 0%. The recovery model is set to simple. At this point my main concern is that I seem to have 19GB of unaccounted-for used space. Is there something else I should look at? Normally I wouldn't care and would make this a passive research project, except this particular situation calls for us to do a backup and restore on a weekly basis to put a copy on a satellite (which has no internet, so it must be done manually). I'd much rather copy 1GB (or even 5GB!) than 20GB of data each week.

    sp_spaceused reports the following (database name, database size, unallocated space):

        Navigator-Production    19184.56 MB    3.02 MB

    And the second part of it (reserved, data, index size, unused):

        19640872 KB    19512112 KB    108184 KB    20576 KB

    I've found a few other scripts (such as the ones from two of the database size questions here); they all report the same information shown above or below. The script I am using is from SQLTeam. Here is the header info:

        * BigTables.sql
        * Bill Graziano (SQLTeam.com)
        * graz@<email removed>
        * v1.11

    The top few tables show this (table, rows, reserved space, data, index, unused, etc.):

        Activity           1143639    131 MB    89 MB    41768 KB    1648 KB    46%    1%
        EventAttendance     883261     90 MB    58 MB    32264 KB     328 KB    54%    0%
        Person              113437     31 MB    15 MB    15752 KB     912 KB   103%    3%
        HouseholdMember     113443     12 MB     6 MB     5224 KB     432 KB    82%    4%
        PostalAddress        48870      8 MB     6 MB     2200 KB     280 KB    36%    3%

    The rest of the tables are either the same in size or smaller. No more than 50 tables.

    Update 1: All tables use unique identifiers, usually an int incremented by 1 per row. I've also re-indexed everything. I ran the DBCC shrink command as well as updating the usage before and after, over and over. An interesting thing I found is that when I restarted the server, confirmed no one was using it (and no maintenance procs were running; this is a very new application -- under a week old) and went to run the shrink, every now and then it would say something about data having changed. Googling yielded too few useful answers, with the obvious not applying (it was 1am and I disconnected everyone, so it seems impossible that was really the case). The data was migrated via C# code which basically looked at another server and brought things over. The quantity of deletes, at this point in time, is probably under 50k rows. Even if those were the biggest rows, that wouldn't be more than 100MB, I would imagine. When I go to shrink via the GUI it reports 0% available to shrink, indicating that I've already gotten it as small as it thinks it can go.

    Update 2: sp_spaceused 'Activity' yields this (which seems right on the money):

        Activity    1143639    134488 KB    91072 KB    41768 KB    1648 KB

    Fill factor was 90. All primary keys are ints. Here is the command I used to update usage: DBCC UPDATEUSAGE(0);

    Update 3: Per Edosoft's request:

        Image    111975    2407773    19262184

    It appears as though the Image table believes it's the 19GB portion. I don't understand what this means, though. Is it really 19GB or is it misrepresented?

    Update 4: Talking to a co-worker, I found out that it's because of the pages, as someone else here has also stated might be the case. The only index on the Image table is a clustered PK. Is this something I can fix or do I just have to deal with it? The regular script shows the Image table to be 6MB in size.

    Update 5: I think I'm just going to have to deal with it, after further research. The images have been resized to be roughly 2-5KB each; on a normal file system they don't consume much space, but on SQL Server they seem to consume considerably more. The real answer, in the long run, will likely be separating that table into another partition or something similar.
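
    A quick way to confirm where the space actually lives is to break the table down by allocation unit type. The query below is a diagnostic sketch rather than anything from the original post; it assumes the table is named dbo.Image and works on SQL Server 2005 and later. If most of the pages show up as LOB_DATA, the many small 2-5KB images are what is consuming the 19GB in question.

        -- Break down the Image table's space by allocation unit type (IN_ROW_DATA,
        -- ROW_OVERFLOW_DATA, LOB_DATA) to see which kind of page holds the bulk of it.
        SELECT  au.type_desc,
                SUM(au.total_pages) * 8 / 1024 AS total_mb,
                SUM(au.used_pages)  * 8 / 1024 AS used_mb
        FROM    sys.partitions AS p
        JOIN    sys.allocation_units AS au
                ON (au.type IN (1, 3) AND au.container_id = p.hobt_id)       -- in-row and row-overflow pages
                OR (au.type = 2       AND au.container_id = p.partition_id)  -- LOB pages
        WHERE   p.object_id = OBJECT_ID(N'dbo.Image')
        GROUP BY au.type_desc;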

    Read the article

  • Building the Ultimate SharePoint 2010 Development Environment

    - by Manesh Karunakaran
    It’s been more than a month since SharePoint 2010 RTMed, and a lot of people have downloaded and set up their very own SharePoint 2010 development rigs. Quite a few people have written blogs about setting up good development environments; there is even an MSDN article on it. Two of the blogs worth noting are from MVPs Sahil Malik and Wictor Wilén. Make sure that you check these out as well. Part of the bad side-effects of being a geek is the need to do the technical stuff the best way possible (pragmatic or otherwise), but the problem with this is that what is considered “best” is relative. Precisely the reason why you are reading this post now. Most of the posts that I read are outdated or need updating, use the wrong OSes or virtualization solutions (again, opinions vary), or use them the wrong way. Here’s a developer’s view of Building the Ultimate SharePoint 2010 Development Rig. If you are a sales guy, it’s time to close this window.

    Confusion 1: Which Host Operating System and Virtualization Solution to use?
    This point has been beaten to death in numerous blog posts in the past; if you have time to invest, read this excellent post by our very own SharePoint Joel on the subject. But if you are planning to build the Ultimate Development Rig, then Windows Server 2008 R2 with Hyper-V is the option that you should be looking at. I have been using this as my primary OS for about 6-7 months now, and I haven’t had any driver or application compatibility issues. In my experience all the Windows 7 drivers work fine with WIN2008 R2 as well. You can enable Aero for eye candy (and the Windows 7 look and feel), and except for a few things like hibernation support (which can be enabled if you really want it), Windows Server 2008 R2 is the best workstation OS that I have used to date. But frankly, the answer to the question of which OS to use depends primarily on one question - are you willing to change your primary OS? If the answer is ‘Yes’, then Windows 2008 R2 with Hyper-V is the best option; if not, look at VMware or VirtualBox, both of which are equally good. Those coming from a Virtual PC background might prefer Sun VirtualBox. Besides, these provide support for running 64-bit guest machines on 32-bit hosts if the underlying hardware is truly 64-bit. See my earlier post on this. Since we are going to make the ultimate rig, we will use Windows Server 2008 R2 with Hyper-V, for the reasons mentioned above.

    Confusion 2: Should I use a multi-(virtual) server set up?
    A lot of people use multiple servers for their development environments - as Wictor Wilén suggests - one server hosting Active Directory, one hosting SharePoint Server and another one for SQL Server. True, this mimics the production environment in the best possible way, but as somebody who has fallen for this set up earlier, I can tell you that you don’t really gain anything by doing this. Microsoft has done well to ensure that if you can do it on one machine, you can do it in a farm environment as well. Besides, when you run multiple server-class machine instances in parallel, a lot of processor cycles are wasted for no good use. In my personal experience, as somebody who needs to switch between MOSS 2007/SharePoint 2010 environments from time to time, the best possible solution is to:
    - Make the host Windows Server 2008 R2 machine your Domain Controller (AD Server).
    - Make all your virtual guest OSes join this domain.
    - Have each individual guest OS image have its own local SQL Server instance.
    The advantages are that you can reuse the users and groups in each of the guest operating systems, you can manage the users in one place, AD is lightweight and doesn't take too many resources on your host machine, and having separate SQL instances for each of the development images gives you maximum flexibility in terms of configuration - for example, your SharePoint rigs can have simpler DB configurations compared to your MS BI blast pits.

    Confusion 3: Which Operating System should I use to run SharePoint 2010?
    Now that’s a no-brainer. Use Windows 2008 R2 as your Guest OS. When you are building the ultimate rig, why compromise? If you are planning to run Windows Server 2008 as your Guest OS, there are a few patches that you need to install at different times during the installation; for that, follow the steps mentioned here.

    Okay, now that we have made our choices, let’s get to the interesting part of building the rig.

    Step 1: Prepare the host machine – Install Windows Server 2008 R2
    Install Windows Server 2008 R2 on your best desktop/laptop. If you have read this far, I am quite sure that you are somebody who can install an OS on your own, so go ahead and do that. Make sure that you run the compatibility wizard before you go ahead and nuke your current OS. There are plenty of blogs telling you how to make a good Windows 2008 R2 workstation that feels and behaves like a Windows 7 machine; follow one, and once you are done, head to Step 2.

    Step 2: Configure the host machine as a Domain Controller
    Before we begin, let me tell you that this step is completely optional; you don’t really need to do this, and you can simply use the local users on the guest machines instead, but this is a much cleaner approach to managing users and groups if you run multiple guest operating systems. This post neatly explains how to configure your Windows Server 2008 R2 host machine as a Domain Controller. Follow those simple steps and you are good to go. If you are not able to get it to work, try this.

    Step 3: Prepare the guest machine – Install Windows Server 2008 R2
    - Open Hyper-V Manager.
    - Choose to create a new guest operating system.
    - Allocate at least 2 GB of memory to the Guest OS.
    - Choose the Windows 2008 R2 installation media.
    - Start the virtual machine to commence installation.
    - Once the installation is done, activate the OS.

    Step 4: Make the guest operating systems join the domain
    This step is quite simple; just follow the steps below.
    - Fire up Hyper-V Manager and open your Guest OS.
    - Click on Start, right-click on ‘Computer’ and choose ‘Properties’.
    - On the window that pops up, click on ‘Change Settings’.
    - On the ‘System Properties’ window that comes up, click on the ‘Change’ button.
    - A window named ‘Computer Name/Domain Changes’ opens up. In the text box titled Domain, type in the domain name from Step 2.
    - Click OK; Windows will show you the welcome-to-domain message and ask you to restart the machine. Click OK to restart.
    If the addition to the domain fails, it means that you have not set up networking in Hyper-V for the Guest OS to communicate with the host. To enable it, follow the steps I had mentioned in this post earlier.

    Step 5: Install SQL Server 2008 R2 on the guest machine
    SQL Server 2008 R2 installs without hassle on Windows Server 2008 R2. (SQL Server 2008 needs SP2 to work properly on WIN2008 R2.) SQL Server 2008 R2 also allows you to directly add PowerPivot support to SharePoint. Choose to install in SharePoint Integrated Mode in the Reporting Server configuration.

    Step 6: Install KB971831 and the SharePoint 2010 pre-requisites
    Now install the WCF hotfix for Microsoft Windows (KB971831) from this location, and the SharePoint 2010 pre-requisites from the SP2010 installation media.

    Step 7: Install and configure SharePoint 2010
    Install SharePoint 2010 from the installation media. After the installation is complete, you are prompted to start the SharePoint Products and Technologies Configuration Wizard. If you are using a local instance of Microsoft SQL Server 2008, install the Microsoft SQL Server 2008 KB 970315 x64 before starting the wizard. If your development environment uses a remote instance of Microsoft SQL Server 2008, or if it has a pre-existing installation of Microsoft SQL Server 2008 on which KB 970315 x64 has already been applied, this step is not necessary. With the wizard open, do the following: install SQL Server 2008 KB 970315 x64, and after the installation is finished, complete the wizard. Alternatively, you can choose not to run the wizard by clearing the SharePoint Products and Technologies Configuration Wizard check box and closing the completed installation dialog box; in that case, install SQL Server 2008 KB 970315 x64 and then manually start the SharePoint Products and Technologies Configuration Wizard by opening a Command Prompt window and executing the following command:
        C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\14\BIN\psconfigui.exe
    The SharePoint Products and Technologies Configuration Wizard may fail if you are using a computer that is joined to a domain but that is not connected to a domain controller.

    Step 8: Install Visual Studio 2010 and the SharePoint 2010 SDK
    Install Visual Studio 2010, then download and install the Microsoft SharePoint 2010 SDK.

    Step 9: Install PowerPivot for SharePoint and configure Reporting Services
    Pop the SQL Server 2008 R2 installation media in once again and install PowerPivot for SharePoint. This will get added as another instance named POWERPIVOT. Configure Reporting Services by following the steps mentioned here. If you need to get down to the details of how the integration between SharePoint 2010 and SQL Server 2008 R2 works, see Working Together: SQL Server 2008 R2 Reporting Services Integration in SharePoint 2010, an excellent article by Alan Le Marquand.

    Step 10: Download and install the sample databases for Microsoft SQL Server 2008 R2
    SharePoint 2010 comes with a lot of cool stuff like PerformancePoint Services and BCS; if you want to try these out, you need to have data in your databases. So if you want to save yourself the trouble of creating sample data for your PerformancePoint and BCS experiments, download and install the sample databases for Microsoft SQL Server 2008 R2 from CodePlex.

    And you are done! Fire up your Visual Studio 2010 and start coding away!

    Read the article

  • Copy Only Backups for Adhoc Backups

    Introduction In most organizations, backup plans are implemented using full, differential and transaction log backups. The normal scenario would be to take a full backup on Sunday (off-peak hours), differential backups daily at midnight, and transaction log backups on an hourly basis. What ... [Read Full Article]
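
    As a small illustration of the idea (the database name and backup path below are placeholders, not taken from the article), an ad-hoc full backup can be marked COPY_ONLY so that it does not reset the differential base used by the scheduled backup plan:

        -- Ad-hoc full backup that leaves the regular differential base untouched
        BACKUP DATABASE AdventureWorks
        TO DISK = N'D:\AdHocBackups\AdventureWorks_copyonly.bak'
        WITH COPY_ONLY, INIT, CHECKSUM, STATS = 10;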

    Read the article

  • Reporting Services Disaster Recovery

    Dave Lumley presents a Reporting Services disaster recovery solution for SQL Server Standard Edition, using two servers. Worth the read if you don't run Enterprise.

    Read the article

  • Showplan Operator of the Week - Merge Interval

    When Fabiano agreed to undertake the epic task of describing each showplan operator, none of us quite predicted the interesting ways that the series helps to understand how the query optimizer works. With the Merge Interval, Fabiano comes up with some insights about the way that the query optimizer handles overlapping ranges efficiently.

    Read the article

  • How to Communicate Between SSBS Applications Across Instances

    Arshad Ali demonstrates how to verify the SQL Server Service Broker (SSBS) configuration when both the Initiator and Target are in different SQL Server instances, how to communicate between them and how to monitor the conversation.

    Read the article

  • Row Oriented Security Using Triggers

    Handling security in an application can be a bit cumbersome. New author R Glen Cooper brings us a database design technique from the real world that can help you.

    Read the article

  • Efficient, partial, point-in-time database restores

    - by GavinPayneUK
    This article is about a situation where many of us could describe the theoretical approach to solving the problem, but then struggle to understand why SQL Server isn't following that theoretical approach when we try it for real. Earlier this week, I had a client ask about the best way to perform a partial database restore (1 of 1300 filegroups), to a specific point in time, using a differential backup, and therefore without restoring each transaction log backup taken since the full backup. The last point...(read more)
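
    For readers who only want the general shape of such a restore, the sketch below shows the piecemeal-restore syntax involved; the database, filegroup, file names and STOPAT time are invented for illustration, and this is not presented as the article's worked answer:

        -- Restore the primary filegroup plus the single filegroup of interest, leaving the database restoring
        RESTORE DATABASE Sales
            FILEGROUP = N'FG_Archive'
            FROM DISK = N'D:\Backups\Sales_full.bak'
            WITH PARTIAL, NORECOVERY;

        -- Apply the most recent differential, which avoids restoring every log backup taken since the full
        RESTORE DATABASE Sales
            FROM DISK = N'D:\Backups\Sales_diff.bak'
            WITH NORECOVERY;

        -- Roll forward only the log backups taken after the differential, stopping at the required point in time
        RESTORE LOG Sales
            FROM DISK = N'D:\Backups\Sales_log.trn'
            WITH STOPAT = N'2012-06-01T10:30:00', RECOVERY;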

    Read the article

  • Streamlining granular recovery for SharePoint, with Red Gate and Metalogix

    We have recently found an elegant way to reduce the time and disk space required for SharePoint administrators who need to perform granular recovery operations out of their SQL Server backup files. I used to get customer calls that would go something like this:

    Read the article

  • ShowPlan Operator of the Week - Merge Join

    Did you ever wonder how and why your indexes affect the performance of joins? Once you've read Fabiano Amorim's unforgettable explanation, you'll learn to love the MERGE operator, and plan your indexes so as to allow the Query Optimiser to use it.

    Read the article

  • How to setup Ubuntu as a backup server for my Windows computer?

    - by derek
    I took my old computer (AMD 5600+, 2GB RAM, 880GT, 250GB SATA, etc.) and decided to jump in and try Ubuntu for the first time. I have very, very little knowledge of Linux in general (some Red Hat when I was younger). My main desktop is Windows 7, and my plan is to use the Ubuntu computer as a file server, so to speak. I want to be able to set up backup schedules from my Windows PC to the Ubuntu PC (I plan on leaving it on 24/7 to double as a seedbox). How do I go about doing this?

    Read the article

  • Is it legal or a good idea to have a backup of all client sites on my own server?

    - by mario
    I have seen many times that when we build a website for a client, there is a possibility that the site gets changed over a period of time. I was thinking that, from now on, whichever site I make, I will host a copy of it on a personal server - like client1.myserver.com - so that even if they change it, I still have a copy. That way, if I need to show someone my work, or refer back to a few things myself, I have the proof there. I will not make these copies public but will password-protect them. I want to know whether this is legal, and whether it is a good idea or not.

    Read the article

  • Database Deployment: The Bits - Copying Data Out

    Occasionally, when deploying a database, you need to copy data out to file from all the tables in a database. Phil Factor shows how to do it, and illustrates its use by copying an entire database from one server to another.
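
    One common way to script that kind of bulk export (not necessarily the approach the article takes) is to have the server generate a bcp command per user table and run the resulting lines from a command prompt; the export folder below is a placeholder:

        -- Generate one native-format bcp export command for every user table
        SELECT 'bcp ' + QUOTENAME(DB_NAME()) + '.' + QUOTENAME(s.name) + '.' + QUOTENAME(t.name)
             + ' out "C:\Export\' + s.name + '_' + t.name + '.dat" -n -T -S ' + @@SERVERNAME
        FROM sys.tables AS t
        JOIN sys.schemas AS s
          ON s.schema_id = t.schema_id
        ORDER BY s.name, t.name;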

    Read the article

  • Parent Package Variable Configuration and Logging

    In SSIS, when we use 'parent package variable configuration', the order of event execution is quite different from the normal execution order. In this article we will see what the impact on execution order is when we use 'parent package variable configuration'.

    Read the article

  • Database Deployment: The Bits - Getting Data In

    Quite often, the database developer or tester is faced with having to load data into a newly created database. What could be simpler? Quite a lot of things, it seems.
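
    For the simplest case, and purely as a hedged sketch (the file, table and delimiters below are placeholders, and the article goes well beyond this), BULK INSERT will load a delimited file into a table that already exists:

        -- Load a comma-delimited file into a pre-created staging table, skipping the header row
        BULK INSERT dbo.Staging_Customer
        FROM 'C:\Import\customer.csv'
        WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2, TABLOCK);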

    Read the article

  • How to use SQL file streaming win32 API and support WCF streaming

    - by Mahesh
    I'm using the SQL Server FILESTREAM type to store large files in the backend. I'm trying to use WCF to stream the file across to the clients. I'm able to get a handle to the file using SqlFileStream (API). I then try to return this stream. I have implemented data chunking on the client side to retrieve the data from the stream. I'm able to do it for a regular FileStream and a MemoryStream. Also, if I convert the SqlFileStream into a MemoryStream, that works too. The only thing that doesn't work is when I try to return the SqlFileStream directly. What am I doing wrong? I have tried both NetTcpBinding with streaming enabled and HTTP binding with MTOM encoding. This is the error message I am getting: "Socket connection was aborted. This could be caused by an error processing your message or a receive timeout being exceeded by the remote host, or an underlying network issue. Local socket timeout was 00:09:59..." Here is my sample code:

        RemoteFileInfo info = new RemoteFileInfo();
        info.FileName = "SampleXMLFileService.xml";
        string pathName = DataAccess.GetDataSnapshotPath("DataSnapshot1");
        SqlConnection connection = DataAccess.GetConnection();
        SqlTransaction sqlTransaction = connection.BeginTransaction("SQLSileStreamingTrans");
        SqlCommand command = new SqlCommand();
        command.Connection = connection;
        command.Transaction = sqlTransaction;
        command.CommandText = "SELECT GET_FILESTREAM_TRANSACTION_CONTEXT()";
        byte[] transcationContext = command.ExecuteScalar() as byte[];
        SqlFileStream stream = new SqlFileStream(pathName, transcationContext, FileAccess.Read);
        // byte[] bytes = new byte[stream.Length];
        // stream.Read(bytes, 0, (int) stream.Length);
        // Stream reeturnStream = stream;
        // MemoryStream memoryStream = new MemoryStream(bytes);
        info.FileByteStream = stream;
        info.Length = info.FileByteStream.Length;
        connection.Close();
        return info;

        [MessageContract]
        public class RemoteFileInfo : IDisposable
        {
            [MessageHeader(MustUnderstand = true)]
            public string FileName;

            [MessageHeader(MustUnderstand = true)]
            public long Length;

            [MessageBodyMember(Order = 1)]
            public System.IO.Stream FileByteStream;

            public void Dispose()
            {
                if (FileByteStream != null)
                {
                    FileByteStream.Close();
                    FileByteStream = null;
                }
            }
        }

    Any help is appreciated.

    Read the article

  • ReportBuilder.application fails on my PC - but works on localhost

    - by JayTee
    We're running SQL 2005 on Win2K3 server and are using SSRS. Here's the situation:
    - I can run Report Builder from localhost
    - My coworker can run Report Builder on his Vista computer
    - Another coworker can run Report Builder on his XP SP3 computer (IE7)
    - I can NOT run Report Builder on my XP SP3 computer (IE7)
    I'm told that it could be anything from an errant registry entry to a group policy problem. Here is what I've tried:
    - Put the site into "Trusted Sites" with "low" security
    - Re-install .NET
    - Create a new local user account and attempt to run it
    The results? Every single time, I get a dialog box: "Application cannot be started. Contact the application vendor." I click the details button and get this:

        PLATFORM VERSION INFO
        Windows : 5.1.2600.196608 (Win32NT)
        Common Language Runtime : 2.0.50727.3607
        System.Deployment.dll : 2.0.50727.3053 (netfxsp.050727-3000)
        mscorwks.dll : 2.0.50727.3607 (GDR.050727-3600)
        dfdll.dll : 2.0.50727.3053 (netfxsp.050727-3000)
        dfshim.dll : 2.0.50727.3053 (netfxsp.050727-3000)

        SOURCES
        Deployment url : http://www.example.com/ReportServer/ReportBuilder/ReportBuilder.application
        Server : Microsoft-IIS/6.0
        X-Powered-By : ASP.NET
        X-AspNet-Version : 2.0.50727

        IDENTITIES
        Deployment Identity : ReportBuilder.application, Version=9.0.3042.0, Culture=neutral, PublicKeyToken=c3bce3770c238a49, processorArchitecture=msil

        APPLICATION SUMMARY
        * Online only application.
        * Trust url parameter is set.

        ERROR SUMMARY
        Below is a summary of the errors, details of these errors are listed later in the log.
        * Activation of http://www.example.com/ReportServer/ReportBuilder/ReportBuilder.application resulted in exception. Following failure messages were detected:
          + Value does not fall within the expected range.

        COMPONENT STORE TRANSACTION FAILURE SUMMARY
        No transaction error was detected.

        WARNINGS
        There were no warnings during this operation.

        OPERATION PROGRESS STATUS
        * [4/7/2010 2:53:57 PM] : Activation of http://www.example.com/ReportServer/ReportBuilder/ReportBuilder.application has started.
        * [4/7/2010 2:53:58 PM] : Processing of deployment manifest has successfully completed.

        ERROR DETAILS
        Following errors were detected during this operation.
        * [4/7/2010 2:53:58 PM] System.ArgumentException - Value does not fall within the expected range.
          - Source: System.Deployment
          - Stack trace:
            at System.Deployment.Application.NativeMethods.CorLaunchApplication(UInt32 hostType, String applicationFullName, Int32 manifestPathsCount, String[] manifestPaths, Int32 activationDataCount, String[] activationData, PROCESS_INFORMATION processInformation)
            at System.Deployment.Application.ComponentStore.ActivateApplication(DefinitionAppId appId, String activationParameter, Boolean useActivationParameter)
            at System.Deployment.Application.SubscriptionStore.ActivateApplication(DefinitionAppId appId, String activationParameter, Boolean useActivationParameter)
            at System.Deployment.Application.ApplicationActivator.Activate(DefinitionAppId appId, AssemblyManifest appManifest, String activationParameter, Boolean useActivationParameter)
            at System.Deployment.Application.ApplicationActivator.PerformDeploymentActivation(Uri activationUri, Boolean isShortcut, String textualSubId, String deploymentProviderUrlFromExtension, BrowserSettings browserSettings, String& errorPageUrl)
            at System.Deployment.Application.ApplicationActivator.ActivateDeploymentWorker(Object state)

        COMPONENT STORE TRANSACTION DETAILS
        * Transaction at [4/7/2010 2:53:58 PM]
          + System.Deployment.Internal.Isolation.StoreOperationSetDeploymentMetadata
            - Status: Set
            - HRESULT: 0x0
          + System.Deployment.Internal.Isolation.StoreTransactionOperationType (27)
            - HRESULT: 0x0

    I'm really at a loss. I'm certain there is something on my PC preventing the application from running - but I just don't know what. Google hasn't been much of a help because most problems are related to the server configuration (which I know is correct since it works on other PCs). Help me, Overflow Kenobi, you're my only hope...

    Read the article

  • Calculate year for end date: PostgreSQL

    - by Dave Jarvis
    Background
    Users can pick dates as shown in the following screen shot. Any starting month/day and ending month/day combinations are valid, such as:
    - Mar 22 to Jun 22
    - Dec 1 to Feb 28
    The second combination is difficult (I call it the "tricky date scenario") because the ending month/day falls earlier in the calendar than the starting month/day, so the ending date belongs to the year after the starting date. That is to say, for the year 1900 (also shown selected in the screen shot above), the full dates would be:
        Dec 22, 1900 to Feb 28, 1901
        Dec 22, 1901 to Feb 28, 1902
        ...
        Dec 22, 2007 to Feb 28, 2008
        Dec 22, 2008 to Feb 28, 2009

    Problem
    Writing a SQL statement that selects values from a table with dates that fall between the start month/day and end month/day, regardless of how the start and end days are selected. In other words, this is a year-wrapping problem.

    Inputs
    The query receives as parameters:
    - Year1, Year2: The full range of years, independent of month/day combination.
    - Month1, Day1: The starting day within the year to gather data.
    - Month2, Day2: The ending day within the year (or the next year) to gather data.

    Previous Attempt
    Consider the following MySQL code (that worked):

        end_year = start_year + greatest( -1 * sign(
            datediff(
                date( concat_ws('-', year, end_month, end_day ) ),
                date( concat_ws('-', year, start_month, start_day ) )
            )
        ), 0 )

    How it works, with respect to the tricky date scenario:
    - Create two dates in the current year. The first date is Dec 22, 1900 and the second date is Feb 28, 1900.
    - Count the difference, in days, between the two dates.
    - If the result is negative, it means the year for the second date must be incremented by 1. In this case: add 1 to the current year and create a new end date, Feb 28, 1901. Then check to see if the date range for the data falls between the start and calculated end date.
    - If the result is positive, the dates have been provided in chronological order and nothing special needs to be done.
    This worked in MySQL because the difference in dates would be positive or negative. In PostgreSQL, the equivalent functionality always returns a positive number, regardless of their relative chronological order.

    Question
    How should the following (broken) code be rewritten for PostgreSQL to take into consideration the relative chronological order of the starting and ending month/day pairs (with respect to an annual temporal displacement)?

        SELECT m.amount
        FROM measurement m
        WHERE (extract(MONTH FROM m.taken) >= month1 AND extract(DAY FROM m.taken) >= day1)
          AND (extract(MONTH FROM m.taken) <= month2 AND extract(DAY FROM m.taken) <= day2)

    Any thoughts, comments, or questions? (The dates are pre-parsed into MM/DD format in PHP. My preference is for a pure PostgreSQL solution, but I am open to suggestions on what might make the problem simpler using PHP.)

    Versions
    PostgreSQL 8.4.4 and PHP 5.2.10
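
    One way to express the wrap-around in pure PostgreSQL - offered here only as a sketch that reuses the question's table and parameter names, and that leaves the Year1/Year2 bounds out just as the snippet above does - is to compare (month, day) pairs lexicographically with row comparisons, switching AND to OR when the range crosses the year boundary:

        SELECT m.amount
        FROM   measurement m
        WHERE  CASE
                 WHEN (month1, day1) <= (month2, day2) THEN
                          -- normal range, e.g. Mar 22 .. Jun 22
                          (EXTRACT(MONTH FROM m.taken), EXTRACT(DAY FROM m.taken)) >= (month1, day1)
                      AND (EXTRACT(MONTH FROM m.taken), EXTRACT(DAY FROM m.taken)) <= (month2, day2)
                 ELSE
                          -- wrapped range, e.g. Dec 22 .. Feb 28
                          (EXTRACT(MONTH FROM m.taken), EXTRACT(DAY FROM m.taken)) >= (month1, day1)
                       OR (EXTRACT(MONTH FROM m.taken), EXTRACT(DAY FROM m.taken)) <= (month2, day2)
               END;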

    Read the article

  • LINQ InsertOnSubmit Required Fields needed for debugging

    - by Derek Hunziker
    Hi All, I've been using the ADO.NET Strongly-Typed DataSet model for about 2 years now for handling CRUD and stored procedure executions. This past year I built my first MVC app and I really enjoyed the ease and flexibility of LINQ. Perhaps the biggest selling point for me was that with LINQ I didn't have to create "Insert" stored procedures that would return the SCOPE_IDENTITY anymore (the auto-generated insert statements in the DataSet model were not capable of this without modification). Currently, I'm using LINQ with ASP.NET 3.5 WebForms. My inserts look like this:

        ProductsDataContext dc = new ProductsDataContext();
        product p = new product
        {
            Title = "New Product",
            Price = 59.99,
            Archived = false
        };
        dc.products.InsertOnSubmit(p);
        dc.SubmitChanges();
        int productId = p.Id;

    So, this product example is pretty basic, right, and in the future I'll probably be adding more fields to the database, such as "InStock", "Quantity", etc. The way I understand it, I will need to add those fields to the database table and then delete and re-add the tables in the LINQ to SQL Class design view in order to refresh the DataContext. Does that sound right? The problem is that any new fields that are non-null are NOT caught by the ASP.NET build process. For example, if I added a non-null field of "Quantity" to the database, the code above would still build. In the DataSet model, the stored procedure method would accept a certain number of parameters and would warn me that my insert would fail if I didn't include a quantity value. The same goes for LINQ stored procedure methods; however, to my knowledge, LINQ doesn't offer a way to auto-generate the insert statements, which means I'm back to where I started. The bottom line is: if I use insert statements like the one above and I add a non-null field to my database, it would break my app in about 10-20 places and there would be no way for me to detect it. Is my only option to do a solution-wide search for the keyword "products.InsertOnSubmit" and make sure the new field is getting assigned? Is there a better way? Thanks!

    Read the article

  • Access Database using DoCmd.OpenForm where clause - returning all values

    - by primus285
    DoCmd.OpenForm "Database Search", acFormDS, , srcLastName & "AND " & srcFirstName This is only a small sample of the where clause - there are many more terms. First, there is a set of If, Then type tings up top that set the variable srcLastName and srcFirstName to some value. These are not the problem and work just fine. The trouble is getting them to return all values (for instance if you only want to search by one, on neither(return full database list)) Thus far I have settled for (in the if then section): srcLastName = "[Lastname] =" & Chr(34) & cboLastName & Chr(34) - to search for something and srcLastName = "[Lastname] <" & Chr(34) & "Nuthin" & Chr(34) - to return everything (not equal to an absurd and mispelled database term.) The trouble is that data that is null is also not returned. If I have a null firstname, it will not show up in any search period. is there a term I can set [lastname] and [firstname] equal to that will return EVERYTHING (null, open, data, numbers, wierd stuff and otherwise) in a search an SQL form of "give me everything shes got scotty" if you will. the real issue here comes from the data entry - if I could just know that the people would enter everything 100% of the time, this code would work. but forget to enter the persons age or whatever, and it wont return that entry. So far, the only other solution I have come up with is to put a counter in each if then statement. The count will go up by one for each thing that is being searched by. Then if the count is = 1, then I can search by something like just DoCmd.OpenForm "Database Search", acFormDS, , srcLastName or DoCmd.OpenForm "Database Search", acFormDS, , srcFirstName then revert back to the DoCmd.OpenForm "Database Search", acFormDS, , srcLastName & "AND " & srcFirstName when the count is 2 or more truoble here is that it only works for one (unless I so wanted to create a custon list of 2 combined, 3 combined, 4 combined, but that is not happening)

    Read the article
