Search Results

Search found 99285 results on 3972 pages for 'server load'.

  • Sql Server Compact 2005 on Visual Studio 2008

    - by Tim
    I'm working on a Windows Forms application that interacts with a Sql Compact database file created by SQL Server 2005. This application was originally developed in Visual Studio 2005 but was recently converted to a Visual Studio 2008 solution. In regards to Sql Compact, we made sure the references were all still set to the assemblies that handle the 2005 version of Sql Compact rather than Sql Compact 3.5. Having done this, the application still runs just as it should - it will still interact with the Compact database, perform synchronization operations, etc. However, I just discovered today that Visual Studio tools such as the DataSet Designer do not play well with a Sql Compact database file of an older version than 3.5. If I go to the New Connection... wizard, the only Sql Compact Data Source / Data Provider are for Sql Compact 3.5. I assume that Visual Studio 2008 just doesn't include the data provider for the older version of Sql Compact by default. Is there a way you can add the old version of Sql Compact to the list of "Data Sources" for the connection wizard? To see exactly what I'm referring to, click on the Tools menu of Visual Studio 2008 and click Connect to Database... In the window that comes up, click Change... next to the Data source setting. From this dialog there is no way I can select the earlier version of Sql Compact - only 3.5 is available. Maybe I need to add an assembly reference somewhere? Or copy some file(s) from my Visual Studio 2005 directory over to 2008? I would think there would have to be a way for Visual Studio 2008 to be able to interact with a Sql Compact database from Sql Server 2005. To provide one more bit of detail, I discovered this problem when I went to my DataSet, right-clicked and tried to add a TableAdapter. The first screen that comes up says, "Choose Your Data Connection". If I leave it set to the Sql Compact connection that we've always used, I now get the following error when clicking the Next button: Failed to open a connection to the database "The selected database was created with an earlier version of SQL Server Compact and needs to be upgraded to SQL Server Compact 3.5 before the connection can be opened or tested. Upgrade the database by creating a new data connection and completing the Add Connection dialog box." Check the connection and try again. The only problem here is that we still use Sql Server 2005, and if my understanding is correct, it does not produce subscription files that are compatible with Sql Compact 3.5. If I am wrong in this assumption, please correct me. Any help you can provide is greatly appreciated. Thank you.

    Read the article

  • How to remove "Server name" items from history of SQL Server Management Studio

    - by arsenalogy
    When trying to connect to a server in Management Studio (specifically 2008), there is a field where you enter the Server name. That field also has a drop-down list where it shows a history of servers that you have attempted to connect to. I would like to know: How to remove an individual item from that history. How to remove an item from the Login field history for each Server name. Thanks!
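
    As far as I know there is no supported per-item delete in SSMS 2008; the commonly reported workaround clears the whole MRU list (Server name and Login history together) by deleting the file SSMS keeps it in. A sketch, assuming a default SSMS 2008 install (close SSMS first):

        rem SSMS 2008 keeps the connect-dialog history here; deleting it resets the whole list
        del "%APPDATA%\Microsoft\Microsoft SQL Server\100\Tools\Shell\SqlStudio.bin"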

    Read the article

  • SQL Agent Command Line Not Saved

    - by Greg
    I have an SSIS package I am trying to schedule. I create a new job under SQL Server Agent. On the Command line tab of the job step, I choose "Edit the command-line manually". The changes are retained as I switch from tab to tab within the job step, but whenever I exit and save the job, the changes are lost. Any ideas what's going on? I'm on SQL Server 2008.
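
    This behavior is widely reported: the command line on that tab is regenerated from the other tabs when the step is saved, so manual edits are silently discarded. A common workaround, sketched below under the assumption that the package runs from the file system (the path and variable name are hypothetical), is to make the step an Operating system (CmdExec) step and call dtexec yourself:

        dtexec /FILE "C:\Packages\MyPackage.dtsx" /SET \Package.Variables[User::MyVar].Properties[Value];MyValue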

    Read the article

  • Slow (to none) performance on SQL 2005 after attaching SQL 2000 database

    - by ploft
    Issue: I used detach/attach to move a SQL database from a SQL 2000 SP4 instance to a much beefier SQL 2005 SP2 server. I have run reindex, reorganize and update statistics a couple of times, but without any success. Queries on SQL 2000 took about 1-2 sec. to complete; now the same queries take 2-3 min on SQL 2005 (and even 2008 - tested it there also). I have looked at the execution plans, and the overall percentages match or are alike on each server.
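
    For what it's worth, the usual post-attach checklist for a 2000-to-2005 move looks roughly like the sketch below (MyDb and MyBigTable are placeholder names). An attached 2000 database keeps compatibility level 80 and its old row/page counts, either of which can produce plans far worse than the originals:

        -- a sketch, not a guaranteed fix; MyDb / MyBigTable are placeholders
        EXEC sp_dbcmptlevel 'MyDb', 90;          -- attached 2000 DBs stay at level 80
        DBCC UPDATEUSAGE ('MyDb');               -- correct row/page counts carried over from 2000
        USE MyDb;
        EXEC sp_updatestats;                     -- refresh all statistics
        UPDATE STATISTICS dbo.MyBigTable WITH FULLSCAN;  -- full scan on the tables the slow queries touch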

    Read the article

  • DB2 load partitioned data in parallel

    - by Erik Paulson
    I have a 10-node DB2 9.5 database, with raw data on each machine (i.e. node1:/scratch/data/dataset.1, node2:/scratch/data/dataset.2, ... node10:/scratch/data/dataset.10). There is no shared NFS mount - none of my machines could handle all of the datasets combined. Each line of a dataset file is a long string of text, column delimited. The first column is the key. I don't know the hash function that DB2 will use, so the dataset is not pre-partitioned. Short of renaming all of my files, is there any way to get DB2 to load them all in parallel? I'm trying to do something like load from '/scratch/data/dataset' of del modified by coldel| fastparse messages /dev/null replace into TESTDB.data_table part_file_location '/scratch/data'; but I have no idea how to suggest to DB2 that it should look for dataset.1 on the first node, etc.
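
    If it helps, the place in the LOAD syntax for this is the PARTITIONED DB CONFIG clause - options such as MODE and PART_FILE_LOCATION belong after it. A heavily hedged sketch (untested): MODE LOAD_ONLY reads a local file on each partition from PART_FILE_LOCATION, but it assumes the files were already split by DB2's own hash and named with the partition number as a three-digit extension (dataset.001 ... dataset.010), so at minimum a rename or a set of symlinks would be needed; if the files are split arbitrarily, the default MODE PARTITION_AND_LOAD (which repartitions rows on the fly) is the safe route.

        -- hedged sketch; assumes /scratch/data/dataset.001 ... dataset.010 exist on their nodes
        load from dataset of del modified by coldel| fastparse
            messages /dev/null replace into TESTDB.data_table
            partitioned db config mode load_only
            part_file_location '/scratch/data';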

    Read the article

  • MySQL: Blank row in table after LOAD DATA INFILE

    - by Tom
    Hi, I'm uploading a large amount of data from a CSV (I'm doing it via MySQL Workbench): LOAD DATA INFILE 'C:/development/mydoc.csv' INTO TABLE mydatabase.mytable CHARACTER SET utf8 FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' LINES TERMINATED BY '\r'; However, I'm noticing that it keeps adding an empty line full of nulls/zeros after the last record. I'm guessing it's because of the "LINES TERMINATED" command. However, I need that to load the data in correctly. Is there some way around this / some better SQL to avoid the blank row in the table? Thanks
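
    A likely explanation, guessing from the statement alone: if the CSV was produced on Windows, its lines end in \r\n, so terminating lines on '\r' leaves a stray \n at the start of each following line, and the terminator after the last record yields one final, empty "line" that MySQL inserts as NULLs/zeros. Matching the file's real line endings is usually enough:

        LOAD DATA INFILE 'C:/development/mydoc.csv'
        INTO TABLE mydatabase.mytable
        CHARACTER SET utf8
        FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
        LINES TERMINATED BY '\r\n';   -- match what the file actually uses

    If a trailing empty row still appears, the file itself probably ends with an extra blank line, which can be trimmed before loading.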

    Read the article

  • LOAD DATA INFILE not working in mariadb

    - by Haseena
    I am trying to migrate from MySQL to MariaDB. This time I am facing an issue with MariaDB: when I try to load a data file into a table, it shows an error like: SQL Error (29): File 'C:/Documents and Settings/Administrator/Local Settings/Temp/SAMPLE/DATA_TEMP1351761841668/SampleFile0' not found (Errcode: 2) But the file already exists at that path. Another point is that the same command works successfully with MySQL. Does MariaDB have a permission issue? I am logged in as Administrator. See my query below: load data infile "'C:/Documents and Settings/Administrator/Local Settings/Temp/SAMPLE/DATA_TEMP1351761841668/SampleFile0" into table SAMPLETABLE; When I change the path to something like "C:/SampleFile0", it works properly; from the Administrator folder it doesn't. Can anyone help me in this regard? I am new to MariaDB.
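
    Look closely at the quoted statement: the path begins "'C: - there is an extra single quote inside the double-quoted string, so the server is asked for a file whose name literally starts with an apostrophe, which is exactly what Errcode 2 (file not found) would report. A cleaned-up statement (untested against this setup):

        LOAD DATA INFILE 'C:/Documents and Settings/Administrator/Local Settings/Temp/SAMPLE/DATA_TEMP1351761841668/SampleFile0'
        INTO TABLE SAMPLETABLE;

    If a genuine permission problem remains after that, LOAD DATA LOCAL INFILE is worth a try: with LOCAL, the client reads the file and sends it to the server, sidestepping the server account's filesystem rights.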

    Read the article

  • Installation doesn't recognize SQL server 2008 saying Server components not found

    - by Aarthi
    Please answer these 2 questions: Our installation is not recognising SQL Server 2008, and if we try to create a DB, it reports that the server components were not found. Please tell me what the issue may be and why it is not being recognised. (Our installation does proceed as a client installation, without any issues, if I point it to the same instance with the DB already created.) Can we install SQL Express 2005 on a machine where SQL Server 2008 is already installed? Please clarify.

    Read the article

  • Unattended Install of SQL Server 2005 Express with LOCAL Server InstanceName

    - by Jeff
    I'm creating an install package using InnoSetup and installing SQL Server 2005 Express. Here's the code below that appears in my RUN section: Filename: "{app}\SQL Server 2005 Express\SQLEXPR.exe" ; Parameters: "-q /norebootchk /qn reboot=ReallySuppress addlocal=all INSTANCENAME=(LOCAL) SCCCHECKLEVEL=IncompatibleComponents:1;MDAC25Version:0 ERRORREPORTING=2 SQLAUTOSTART=1 SAPWD=passwordhere SECURITYMODE=SQL"; WorkingDir: {app}\SQL Server 2005 Express; StatusMsg: Installing Microsoft SQL Server 2005 Express... Please Wait...;Check:SQLVerifyInstall What I'm trying to accomplish is to have SQL Server install with the instance name referencing nothing more than the machine name itself. What I'm receiving instead is a named instance, such as MachineName\SQLEXPRESS, rather than a local (default) instance, which is not what I want. I need a local instance instead of a named instance due to the way my code is written to be able to install and talk with the databases in question. I would change it, trust me, were it not for the fact that this install package is a replacement for a previous package that used the MSDE installer. I have to be able to support both through code. Any suggestions are welcome, but a clear and concise method to get the installer to quietly install using only the machine name is my main goal. Thanks for the help and support!
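
    If memory of the SQL Server 2005 setup parameters serves, the default (unnamed) instance is requested with INSTANCENAME=MSSQLSERVER, not (LOCAL), which the installer reads as an ordinary named-instance name. A sketch of the same RUN entry with only that parameter changed:

        Filename: "{app}\SQL Server 2005 Express\SQLEXPR.exe" ; Parameters: "-q /norebootchk /qn reboot=ReallySuppress addlocal=all INSTANCENAME=MSSQLSERVER SCCCHECKLEVEL=IncompatibleComponents:1;MDAC25Version:0 ERRORREPORTING=2 SQLAUTOSTART=1 SAPWD=passwordhere SECURITYMODE=SQL"; WorkingDir: {app}\SQL Server 2005 Express; StatusMsg: Installing Microsoft SQL Server 2005 Express... Please Wait...;Check:SQLVerifyInstall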

    Read the article

  • SQL Server 2008 to SQL Server 2005

    - by Sakhawat Ali
    I have MDF and LDF files from SQL Server 2005. I attached them to SQL Server 2008 and made some changes to the data. Now, when I attach them back to SQL Server 2005 Express Edition, it gives a version error: The database 'E:\DB\JOBPERS.MDF' cannot be opened because it is version 655. This server supports version 612 and earlier. A downgrade path is not supported. Could not open new database 'E:\DB\JOBPERS.MDF'. CREATE DATABASE is aborted. An attempt to attach an auto-named database for file E:\DB\Jobpers.mdf failed. A database with the same name exists, or specified file cannot be opened, or it is located on UNC share.

    Read the article

  • Load balancing and scheduling algorithms (.NET)

    - by Lukas Šalkauskas
    Hello there, so here is my problem: I have several servers with different configurations. I have different calculations (jobs); I can predict approximately how long each job will take to be calculated. Also, I have priorities. My question is how to keep all machines loaded 99-100% and schedule the jobs in the best way. Each machine can do several calculations at a time. Jobs are pushed to the machine. The central machine knows the current load of each machine. Also, I would like to apply some kind of machine learning here, because I will know statistics of each job (started, finished, cpu load etc.). How can I distribute jobs (calculations) in the best possible way, keeping the priorities in mind? Any suggestions, ideas, or algorithms? My platform is .NET.
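
    As a starting point - a sketch of one classic greedy heuristic, not a definitive answer, with all type and member names invented for illustration - order the jobs by priority and then longest-estimate-first (LPT), and always hand the next job to the machine with the lowest predicted load per slot:

        using System;
        using System.Collections.Generic;
        using System.Linq;

        class Job { public string Name; public int Priority; public TimeSpan Estimate; }
        class Machine { public string Name; public int Slots; public TimeSpan Load; }

        static class Scheduler
        {
            // Greedy: highest priority first, longest job first within a priority band,
            // each job goes to the machine with the least predicted load per slot.
            public static Dictionary<Machine, List<Job>> Assign(IEnumerable<Job> jobs, List<Machine> machines)
            {
                var plan = machines.ToDictionary(m => m, m => new List<Job>());
                var ordered = jobs.OrderByDescending(j => j.Priority)
                                  .ThenByDescending(j => j.Estimate);
                foreach (var job in ordered)
                {
                    var target = machines.OrderBy(m => m.Load.Ticks / (double)m.Slots).First();
                    plan[target].Add(job);
                    target.Load += job.Estimate; // predicted finish time grows
                }
                return plan;
            }
        }

    A learned model would then replace the Estimate field: record actual runtimes per job type and machine, and feed the corrected estimates back into the same greedy loop.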

    Read the article

  • Uploading files to varbinary(max) in SQL Server -- works on one server, not the other

    - by pjabbott
    I have some code that allows users to upload file attachments into a varbinary(max) column in SQL Server from their web browser. It has been working perfectly fine for almost two years, but all of a sudden it stopped working. And it stopped working on only the production database server - it still works fine on the development server. I can only conclude that the code is fine and there is something up with the instance of SQL Server itself. But I have no idea how to isolate the problem. I insert a record into the ATTACHMENT table, only inserting non-binary data like the title and the content type, and then chunk-upload the uploaded file using the following code:

        // get the file stream
        System.IO.Stream fileStream = postedFile.InputStream;
        // make an upload buffer
        byte[] fileBuffer;
        fileBuffer = new byte[1024];
        // make an update command
        SqlCommand fileUpdateCommand = new SqlCommand("update ATTACHMENT set ATTACHMENT_DATA.WRITE(@Data, NULL, NULL) where ATTACHMENT_ID = @ATTACHMENT_ID", sqlConnection, sqlTransaction);
        fileUpdateCommand.Parameters.Add("@Data", SqlDbType.Binary);
        fileUpdateCommand.Parameters.AddWithValue("@ATTACHMENT_ID", newId);
        while (fileStream.Read(fileBuffer, 0, fileBuffer.Length) > 0)
        {
            fileUpdateCommand.Parameters["@Data"].Value = fileBuffer;
            fileUpdateCommand.ExecuteNonQuery();   // <------ FAILS HERE
        }
        fileUpdateCommand.Dispose();
        fileStream.Close();

    Where it says "FAILS HERE", it sits for a while and then I get a SQL Server timeout error on the very first iteration through the loop. If I connect to the development database instead, everything works fine (it runs through the loop many, many times and the commit is successful). Both servers are identical (SQL Server 9.0.3042) and the schemas are identical as well. When I open Activity Monitor right after the timeout to see what's going on, it says the last command is (@Data binary(1024),@ATTACHMENT_ID decimal(4,0))update ATTACHMENT set ATTACHMENT_DATA.WRITE(@Data, NULL, NULL) where ATTACHMENT_ID = @ATTACHMENT_ID which I would expect, but it also says it has a status of "Suspended" and a wait type of "PAGEIOLATCH_SH". I looked this up and it seems to be a bad thing, but I can't find anything specific to my situation. Ideas?
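
    Two observations, offered as hypotheses rather than a diagnosis. First, PAGEIOLATCH_SH means the session is waiting on data-page disk I/O, which points at the production server's storage (or a pending autogrow of the data file) rather than at the code; since SqlCommand.CommandTimeout defaults to 30 seconds, one slow first .WRITE is enough to throw. Second, independent of the timeout, the loop writes the full 1024-byte buffer even when Read returns fewer bytes, padding the final chunk. A sketch addressing both:

        // raise the timeout (default is 30 seconds) while the I/O issue is investigated
        fileUpdateCommand.CommandTimeout = 300;

        int bytesRead;
        while ((bytesRead = fileStream.Read(fileBuffer, 0, fileBuffer.Length)) > 0)
        {
            byte[] chunk = fileBuffer;
            if (bytesRead < fileBuffer.Length)
            {
                chunk = new byte[bytesRead];            // trim the final partial chunk
                Array.Copy(fileBuffer, chunk, bytesRead);
            }
            fileUpdateCommand.Parameters["@Data"].Value = chunk;
            fileUpdateCommand.ExecuteNonQuery();
        }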

    Read the article

  • Pushing a local mercurial repository to a remote server or cloning at server from local

    - by Samaursa
    I have a local repository that I have now decided to push to a remote server (for example, I have a host that allows mercurial repositories and I am also trying to push to bitbucket). The repository has a lot of files and is a little more than 200mb. Locally, I am able to clone the repository without problems. Now I have a lot of changes in this repository, and I have wasted a couple of days trying to figure out how to get the remote server to clone my repository. I cannot get hg serve to work outside of the LAN. I have tried everything. So instead, I created a new repository at the remote servers (both at the host and bitbucket) with nothing in it. Now I am pushing the complete repository that I have locally to these remote locations. So far it has been unsuccessful, as the push operation is stuck on searching for changes and does not give me any other useful output. I have let it go for about an hour with no change. Now my question is, what am I doing wrong as far as hg serve is concerned? I can access it locally but not remotely (through DynDns - I have configured it properly and the router forwards the ports correctly), so that I can get the server to clone the repository the first time, after which I will be pushing to it. My second question is, assuming the clone at the server does not work (for example, if I were to push my current repository to bitbucket), is creating an empty repository at the server and then pushing a local repository to the new remote repository OK? Is that the source of the "searching for changes" problem? Any help in this regard would be greatly appreciated.
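
    On the second question: pushing a full local repository into a freshly created empty remote one is the normal route and not itself the source of the hang - over HTTP a 200mb push can sit on "searching for changes" for a long while with no visible progress, and hg push -v at least shows what it is doing. On hg serve: it is read-only out of the box, so the served repository has to opt in to pushes. A minimal sketch of the served repo's .hg/hgrc (assuming plain HTTP, no SSL):

        # .hg/hgrc inside the served repository
        [web]
        # allow anyone to push (fine on a LAN; lock this down on the internet)
        allow_push = *
        # hg refuses pushes over plain HTTP unless this is disabled
        push_ssl = false

    Then hg serve -p 8000 on the host, and hg push http://yourhost.dyndns.org:8000/ from the other machine (hostname hypothetical).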

    Read the article

  • double authentication issue on IIS / Report Server (SQL server 2008)

    - by Vinzz
    Hi, On a 2003 server box with SQL Server 2008 installed (ReportServer deployed in IIS mode), I've got a virtual directory within IIS with its security set to 'Windows authentication', containing the following html code: <body> <h1>test</h1> <iframe src="/reportserver" width="50%" height="50%" /> </body> From the outside, I get a first login/pwd box displayed to access the html code, then a second one to display the content of the iframe. On the same type of server, but with SQL Server 2005, I don't have this issue (i.e. only one login box). My thought is that the first token should give access to both the page and the iframe, shouldn't it? Any hints on how to set up the report server to fix this? Thanks.

    Read the article

  • jquery .load taking too long after successive calls

    - by user560079
    I have the following jquery script:

        <script>
        $(function(){
          $('#menu-change-div').on('click', 'a.change-content', function(e){
            e.preventDefault();
            $("#content").load($(this).attr("href"));
          });
        });
        </script>

    It dynamically loads content into my content div depending on which link they clicked in the menu. The problem I am having is that when I click multiple links, say 5-10 in a row, the load time goes from instant to ten seconds or more, to not loading at all. Is there something in my function that is causing this? Thanks
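
    Nothing in the handler itself accumulates work, so a plausible suspect (a guess, not verified against this page) is overlapping requests: each click starts a new request while earlier ones are still in flight, and a slow server lets them pile up. Keeping a handle to the last request and aborting it is a cheap guard:

        // a sketch: abort the request still in flight before starting the next one
        var pending = null;
        $(function () {
          $('#menu-change-div').on('click', 'a.change-content', function (e) {
            e.preventDefault();
            if (pending) pending.abort();          // drop the previous request
            pending = $.get($(this).attr('href'), function (html) {
              $('#content').html(html);
              pending = null;
            });
          });
        });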

    Read the article

  • Image container instead of event object in image load event handler

    - by avok00
    I stumbled upon a very strange thing. In FF 3.6 (not tested in others yet) I add an onload handler to an image like this: imgRef.addEventListener("load", activateLink, false); When the load event fires, in activateLink(evt) the evt parameter is not an event, but the "a" tag that contains the image. Why is this?

        function activateLink(evt) {
            // evt turns out to be a reference to the <a> tag (HTMLAnchorElement)
            // that contains the image. Actually two of them. Both dynamically
            // added with addElement.
        }

    I remembered another fact that may be relevant: I have multiple images with the same src, and all of them have registered this same event handler activateLink. Could this be the problem?
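
    Hard to say what FF 3.6 is doing from this excerpt alone, but one robust workaround: inside a handler registered with addEventListener, this is always the element the listener was attached to, so it is a safer handle than the event argument. A defensive sketch:

        function activateLink(evt) {
            var img = this;                  // the <img> the listener was attached to
            var fired = evt && evt.target;   // whatever the event object claims fired
            // work with img, and only consult 'fired' for debugging
        }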

    Read the article

  • jQuery: load refuses to get dynamic content in IE6

    - by user260157
    jQuery refuses to load my dynamic content in IE6. Everything works fine in Firefox & Safari; only IE6 is being a pain. When I try a plain html file containing <p>Hello World</p>, it works properly. But when loading a PHP file it doesn't work! As you can see, the script does multiple things:

        <script type="text/javascript">
        // When the document is ready set up our sortable with its inherent function(s)
        $(document).ready(function() {
            // Sort list & amend in database
            function sortTableMenuAndReload() {
                var order = $('#menuList').sortable('serialize');
                $.post("PLUGINS/SortableMenu/process-sortable.php", order);
                $("#menuList").load("PLUGINS/SortableMenu/sortableMenu_ajax.php");
            }
            function sortTableOrder() {
                var order = $('#menuList').sortable('serialize');
                $.post("PLUGINS/SortableMenu/process-sortable.php", order);
            }
            function sortTableOrderAndRemove(removeID) {
                $('#listItem_' + removeID).remove();
                var order = $('#menuList').sortable('serialize');
                $.post("PLUGINS/SortableMenu/process-sortable.php", order);
                $("#menuList").load("PLUGINS/SortableMenu/sortableMenu_ajax.php");
            }
            $("#menuList > li > .remove").live('click', function () {
                var removeID = $(this).attr('id');
                $.ajax({
                    type: 'post',
                    url: 'PLUGINS/SortableMenu/removeLine.php',
                    data: 'id=' + removeID,
                    success: sortTableOrderAndRemove(removeID)
                });
            });
            $("#menuList > li > .publish").live('click', function () {
                var publishID = $(this).attr('id');
                $.ajax({
                    type: 'post',
                    url: 'PLUGINS/SortableMenu/publishLine.php',
                    data: 'id=' + publishID,
                    success: sortTableOrder
                });
            });
            $('#new_documents > li').draggable({
                addClasses: false,
                helper: 'clone',
                connectToSortable: '#menuList'
            });
            $("#menuList").droppable({
                addClasses: false,
                drop: function() {
                    var clone = $("#menuList > li#newArticleTYPE1");
                    $(clone).attr("id", "listItem_newArticleTYPE1");
                }
            });
            $("#menuList").sortable({
                opacity: 0.6,
                handle: '.handle, .remove',
                update: sortTableMenuAndReload
            });
        });
        </script>
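
    A classic IE6 explanation (hedged, but it fits "static HTML works, PHP doesn't"): IE6 caches ajax GET responses aggressively, so every .load() after the first can be served the stale cached copy instead of hitting the PHP script. Disabling the ajax cache is the usual cure:

        // make jQuery append a timestamp to GET requests so IE6 cannot serve a cached copy
        $.ajaxSetup({ cache: false });

        // or per call, with a hypothetical cache-busting parameter:
        $("#menuList").load("PLUGINS/SortableMenu/sortableMenu_ajax.php?nocache=" + new Date().getTime());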

    Read the article

  • Live() keyword not working on load with dynamic html images

    - by Sara Chipps
    I have images being dynamically added to a page, and I don't seem to be able to get the 'load' event working dynamically with live(). This is the code I currently have:

        $('#largeImg' + nextUniqueItemID).hide();
        $('#largeImg' + nextUniqueItemID).live('load', function() {
            $('#loader' + nextUniqueItemID).hide();
            $('#largeImg' + nextUniqueItemID).show();
        });

    with '#largeImg' + nextUniqueItemID being an image that was added to the page earlier in the function and '#loader' + nextUniqueItemID being a loading image. I feel as if I may be misusing "live", as what I really need is not so much a persistent listener as for the event to fire on the image I just added.
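
    The root cause is that live() works by listening for events that bubble up to the document, and 'load' does not bubble, so live('load', ...) never fires. A common pattern (sketched here) is to bind directly at the moment the image is created, and to cover images that were already cached when the handler was attached:

        var img = $('#largeImg' + nextUniqueItemID).hide();
        img.one('load', function() {
            $('#loader' + nextUniqueItemID).hide();
            $(this).show();
        });
        // a cached image may have finished loading before the handler was bound,
        // in which case 'load' never fires again on its own
        if (img[0] && img[0].complete) img.trigger('load');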

    Read the article

  • jquery load a specific #Div content from multiple html files

    - by Vikram
    Hello friends, I am trying to make a Content Slider for my site. I have multiple HTML files, and the structure of these files is like this:

        <div id="title"><h2>Title of the Slide</h2></div>
        <div id="image"><a href="http://mylink.com"><img src="image.jpg" width="600" height="300" alt="image"></a></div>
        <div id="content">Lorem Ipsum is simply dummy text of the printing and typesetting industry. Lorem Ipsum has been the industry's standard dummy text ever since the 1500s.</div>

    I have been trying to use the following script to get the content (but no success):

        <?php
        function render($position="") {
            ob_start();
            foreach (glob("/slides/*.html") as $fileName) {
                $fname = basename($fileName);
                $curArr = file($fname);
                $slides[$fname]['title'] = $curArr[0];
                $slides[$fname]['image'] = $curArr[1];
                $slides[$fname]['content'] = $curArr[2];
                foreach ($slides as $key => $value) {
        ?>
            <div id="slide-title"> <?php echo $value['title'] ?> </div>
            <div id="slide-content"> <?php echo $value['image'] ?> </div>
            <div id="slide-image"> <?php echo $value['content'] ?> </div>
        <?php
                }
            }
            return ob_get_clean();
        }
        ?>

    But then I came to know about a jQuery function (again no success):

        jQuery.noConflict();
        (function($){
            $(document).ready(function () {
                $('#slide-title').load('slides/slide1.html #title');
                $('#slide-content').load('slides/slide1.html #content');
                $('#slide-image').load('slides/slide1.html #image');
            });
        })(jQuery);

    Now my questions are: Am I using the right syntax? How do I get the content from multiple files using jQuery? Please note: my knowledge of programming is almost '0'. I have just started learning it.
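
    The jQuery syntax above is right: .load('url #fragment') fetches the page and keeps only the matching element, and it lines up with the #title/#image/#content ids in the slide files. To pull from multiple files, the only missing piece is parameterising the file name - a small sketch, assuming the files are named slides/slide1.html, slides/slide2.html, and so on:

        (function($) {
            function loadSlide(n) {
                var url = 'slides/slide' + n + '.html';
                $('#slide-title').load(url + ' #title');
                $('#slide-image').load(url + ' #image');
                $('#slide-content').load(url + ' #content');
            }
            $(document).ready(function() {
                loadSlide(1);   // call loadSlide(2) etc. from prev/next buttons
            });
        })(jQuery);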

    Read the article

  • Load class based on SDK version

    - by Bostjan
    Is there any way I can load a class based on what version of the OS the phone is running? For example: I made an app which requires Android 1.6+. Is there a way for me to load one class or the other based on what OS the phone is running? I'm asking this specifically for contacts. The contacts database changed between 1.6 and 2.0, and the old version's code doesn't retrieve contacts on phones running the new OS. I'd still like to keep my 1.6 requirement, but at the same time I'd like 2.0+ phones to access the contacts part of the app. So can I make 2 APIs, somehow pack both with the app, and decide on the fly which one to use?
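
    Yes - this is the standard per-SDK wrapper pattern: put each contacts implementation behind a shared abstract class and pick the concrete one at runtime. It works because Dalvik only verifies a class when it is first loaded, so the 2.0-only code is never touched on a 1.6 phone. A minimal sketch (all class names hypothetical):

        import android.os.Build;

        public abstract class ContactsAccessor {
            public static ContactsAccessor getInstance() {
                // Build.VERSION.SDK_INT exists from API 4 (Android 1.6) onward,
                // which matches the app's minimum requirement
                String className = Build.VERSION.SDK_INT < 5      // 5 = Android 2.0 (Eclair)
                        ? "com.example.app.ContactsAccessorSdk3"  // old android.provider.Contacts API
                        : "com.example.app.ContactsAccessorSdk5"; // android.provider.ContactsContract
                try {
                    return (ContactsAccessor) Class.forName(className).newInstance();
                } catch (Exception e) {
                    throw new IllegalStateException("Cannot load contacts accessor", e);
                }
            }

            public abstract java.util.List<String> getContactNames(android.content.Context context);
        }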

    Read the article
