Search Results

Search found 3546 results on 142 pages for 'dos batch'.

Page 114/142

  • Is it OK to mix NumberLong and normal integers in the same field in MongoDB?

    - by Nicholas Tolley Cottrell
    I have been using Morphia to persist objects from Java. I have also been running some batch processes from the console. I just realised that some values are now stored as NumberLong and others as plain JavaScript numbers. I have an index on this field. Everything seems to be OK: if I query {f: 100} from the console, it still returns the object even if it actually contains {f: NumberLong(100)}. Is this true of all the drivers? Is it best practice to avoid NumberLong if I can fit the value inside 32 bits? Will I save a lot of data and index space if I convert all NumberLongs to basic numbers?
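
    For what it's worth, a quick check from Python suggests the matching behaviour is driver-independent: MongoDB brackets numeric BSON types together for comparison, so a plain integer query matches NumberLong values too. A minimal sketch with pymongo (database and collection names are placeholders):

        from pymongo import MongoClient
        from bson.int64 import Int64

        coll = MongoClient().test_db.test_coll
        coll.delete_many({})
        coll.insert_one({"f": Int64(100)})   # stored as NumberLong
        coll.insert_one({"f": 100})          # stored as a 32-bit int

        print(coll.count_documents({"f": 100}))   # prints 2: both representations match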

    Read the article

  • Log4net RollingFileAppender skipping every other day in ASP.NET MVC app

    - by tasty_spider_men
    Hi, I've set up some log4net logging on an ASP.NET MVC application I've had running for a little over a month now. I've set up a rolling file (and SMTP) appender on it. The application is hit several thousand times a day - a lot of actions are logged. On top of that, a number of batch jobs run as part of the application and also write to the log. Now, as I've been occupied with other work, I haven't attended to this application much, assuming things were working based on my test setup. Unfortunately, I've discovered that the rolling file appender on my production server seems to be skipping every other day, i.e. I have logs: 10/4/2010, 10/4/2010.1, 10/4/2010.2, 12/4/2010, 12/4/2010.1, 14/4/2010, 14/4/2010.1 etc. Any idea what could be causing this? It is absolutely impossible that there has been no activity on the odd days for the lifetime of the application. Cheers

    Read the article

  • Higher-speed options for executing a very large (20 GB) .sql file in MySQL

    - by Jonogan
    My firm was delivered a 20+ GB .sql file in response to a request for data from the gov't. I don't have many options for getting the data in a different format, so I need options for how to import it in a reasonable amount of time. I'm running it on a high-end server (Win 2008 64-bit, MySQL 5.1) using Navicat's batch execution tool. It's been running for 14 hours and shows no signs of being near completion. Does anyone know of any higher-speed options for such a transaction? Or is this what I should expect given the large file size? Thanks
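
    One commonly cited speed-up is to feed the dump straight through the mysql command-line client rather than a GUI tool, with autocommit and integrity checks relaxed for the duration of the load. A sketch in Python (credentials and file name are placeholders, and this assumes the dump is mostly row data):

        import subprocess

        SPEEDUPS = b"SET autocommit=0; SET unique_checks=0; SET foreign_key_checks=0;\n"

        proc = subprocess.Popen(
            ["mysql", "--user=MyUser", "--password=MyPassword", "MyDatabase"],
            stdin=subprocess.PIPE,
        )
        proc.stdin.write(SPEEDUPS)
        with open("huge_dump.sql", "rb") as dump:
            for chunk in iter(lambda: dump.read(1 << 20), b""):   # stream 1 MB at a time
                proc.stdin.write(chunk)
        proc.stdin.write(b"\nCOMMIT;\n")
        proc.stdin.close()
        proc.wait()

    A single 20 GB transaction may be too large in practice; committing every few hundred thousand rows is a common compromise.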

    Read the article

  • Mac OS X - Time Machine backup fails verification - What can I do to save the history?

    - by usermac75
    Hi, how do I make Time Machine do a new complete backup without losing older versions of backed-up files? Verbose: I am using Time Machine on OS X (Snow Leopard) to back up the whole computer to an external drive. I especially like the "history", i.e. the feature that allows you to restore an older version of a file. Problem: I had some data corruption on my external backup drive. I repaired it with Disk Utility, which found and fixed some faults. After that, the external drive was OK and I could use Time Machine again, and I let Time Machine do one more backup. Then I verified the backup according to http://superuser.com/questions/47628/verifying-time-machine-backups, namely with sudo diff -qr . $HOME/Desktop 2>&1 | tee $HOME/timemachine-diff.log. The command reported several differences and missing files, approx. 200 files in sum. Whereas some of the missing files were caches or excluded directories, the differences do bother me, especially as some of my important documents are listed as differing. How can I make sure that the data on the external drive is synced correctly? Is it possible to have Time Machine do a complete new backup without losing the history? Or to have Time Machine compare all files for differences and re-write all files that differ? Or can I set some flag on the files that do not match to have them copied again (like the archive flag in Windows/DOS)? I'd rather not touch the files myself, because I would like to keep the dates of last change and creation. Thank you for your thoughts!

    Read the article

  • Need help troubleshooting PC

    - by brux
    I have had problems since my dog peed on my computer. Problem: it loads Windows fine, then at random intervals, anywhere from 5 to 30 minutes, it restarts itself. There is nothing in the event log, no errors, no BSOD, just a cold restart. After restarting, it sometimes POSTs and then restarts itself at the end of POST. It will do this many times and then finally load Windows; the cycle then begins again and it eventually restarts. What I have done: I thought it was the HDD at first, since this is the only part of the computer which actually got wet (the case is off the PC and the dog peed down the front where the HDD is located). SeaTools, the Seagate HDD tool, found errors when I ran it inside Windows, so I ran it in DOS mode from a bootable USB stick; it found the same number of errors and fixed them all. I ran the scan again and it said "Good". I loaded Windows, ran the scan there too, and it also said "Good". So the HDD appears to be fine, but the problem persists: random restarts. What else could this be? I have taken the computer apart and cleaned everything, and also taken the PSU apart and cleaned it thoroughly. The problem still persists; what should my next steps be? Thanks in advance.

    Read the article

  • Installed Windows 7 Ultimate on the D drive, and the previous Windows 7 Enterprise on the C drive has stopped starting up

    - by teenup
    Please please help! I have installed Windows 7 Ultimate on the D drive of the same hard drive on my laptop, and the previous Windows 7 Enterprise, which was installed on the C drive, is not booting up now. When I turn on my laptop, I see two Windows 7 entries on the screen. When I select the newer one, it starts; but when I select the older one, which is the Enterprise edition, the system won't start and I get a black screen with this error message: "Windows Boot Manager. Windows failed to start. A recent hardware or software change might be the cause. To fix the problem: Insert your Windows installation disc and restart your computer. Choose your language settings, and then click 'Next.' Click 'Repair your computer.' Info: The boot selection failed because a required device is inaccessible." I notice that when I run the newer OS, the previous OS's drive (which is now D: instead of C:) has become unusable, and when I double-click it, it asks me to format the drive. The data that I had on my old D drive (which is now the C drive for the new OS) I had copied to a network path, and it is available; it contained the Windows 7 Users folder, which I copied when installing the new Windows. I have copied that Users folder to the new OS's C drive thinking the old system would run again, but to no avail. Please please please... if someone can help... I need this urgently. Thanks a lot in advance.

    Read the article

  • Force Response to Download File(s) to Desktop with Ruby?

    - by viatropos
    I was thinking about making a little crop/resize batch processor online, and wanted to know if there is a way to do the following: upload an image and specify dimensions; click "process" and the remote app resizes the image; the image downloads automatically to wherever I uploaded it from (say my desktop), but with a new name (based on the time, for example). This would let me host a free image processor that never stores any data other than tempfiles. Is that possible? Something like Rails' send_file method, but I'm using Sinatra and am looking for something in pure Ruby. What's the basic concept behind this? What if I wanted to do this for multiple files? Assuming I can get multiple files uploaded no problem, how can I download all of them automatically?
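
    The browser side of this is just an HTTP header: Content-Disposition: attachment tells the browser to save the response under a suggested file name instead of rendering it (Sinatra's attachment helper sets exactly this). A rough, framework-agnostic illustration sketched in Python with Flask 2.x, where the image bytes are a placeholder for the real resize output:

        import io
        import time
        from flask import Flask, send_file

        app = Flask(__name__)

        @app.route("/process")
        def process():
            resized = io.BytesIO(b"...image bytes...")   # placeholder for the resize result
            name = "resized-%d.png" % int(time.time())   # time-based name, as described
            return send_file(resized, mimetype="image/png",
                             as_attachment=True, download_name=name)

    For multiple files, a browser will only save one response per request, so the usual options are zipping the batch server-side or issuing one download request per file from the page.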

    Read the article

  • PCI configuration method error (Linux Kernel)

    - by user326580
    (I'm not sure if this is the best place for this question, so I'd be pleased if anyone suggests a more appropriate forum for it.) I'm trying to install Ubuntu 12.04.4 on a netbook (from a USB stick), but the kernel stops very early in the initialization process. After two days of research, I've found that it boots with the parameter pci=conf2 but not with the default conf1 method. Nevertheless, after the kernel boots, it seems that Ubuntu can't find any USB devices, and I'm not able to install it. Trying Debian, whose installer is graphical, I found that the mouse isn't working either; I think PCI devices are not working in general. I tried about 50% of the kernel PCI boot options listed in the kernel-parameters file (in conjunction with the implicit default conf1) without luck. Any suggestions? PS: The problem is the same with kernel 2.6 or 3.

    Read the article

  • Windows Phone: Updating backend datastore (via web service) while keeping UI very responsive

    - by will
    I am developing a Windows Phone app where users can update a list. Each update, delete, add etc need to be stored in a database that sits behind a web service. As well as ensuring all the operations made on the phone end up in the cloud, I need to make sure the app is really responsive and the user doesn’t feel any lag time whatsoever. What’s the best design to use here? Each check box change, each text box edit fires a new thread to contact the web service? Locally store a list of things that need to be updated then send to the server in batch every so often (what about the back button)? Am I missing another even easier implementation? Thanks in advance,
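
    The second option (a local operation log flushed in batches) is the usual pattern for keeping the UI instant: appending to the list is all the UI thread ever does. A platform-agnostic sketch in Python, with the actual web-service call left as a placeholder:

        import threading
        import time

        class Outbox:
            """Record each edit locally; push them to the service in batches."""

            def __init__(self, push, interval=5.0):
                self.pending = []
                self.lock = threading.Lock()
                self.push = push                  # callable that does the web-service call
                self.interval = interval
                threading.Thread(target=self._flush_loop, daemon=True).start()

            def record(self, op):
                # Called from the UI thread; appending to a list returns instantly.
                with self.lock:
                    self.pending.append(op)

            def _flush_loop(self):
                while True:
                    time.sleep(self.interval)
                    with self.lock:
                        batch, self.pending = self.pending, []
                    if batch:
                        self.push(batch)          # one round trip covers many edits

    Flushing on page deactivation covers the back-button case the question raises; anything still pending can also be persisted to local storage and pushed on the next launch.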

    Read the article

  • Fastest way to generate 1,000,000+ random numbers in Python

    - by Sandro
    I am currently writing an app in Python that needs to generate a large amount of random numbers, FAST. Currently I have a scheme going that uses NumPy to generate all of the numbers in a giant batch (about ~500,000 at a time). While this seems to be faster than Python's built-in implementation, I still need it to go faster. Any ideas? I'm open to writing it in C and embedding it in the program, or doing whatever it takes. Constraints on the random numbers: a set of 7 numbers that can all have different bounds, e.g. [0-X1, 0-X2, 0-X3, 0-X4, 0-X5, 0-X6, 0-X7] - currently I am generating a list of 7 numbers with random values from [0-1) then multiplying by [X1..X7]; and a set of 13 numbers that all add up to 1 - currently just generating 13 numbers then dividing by their sum. Would pre-calculating these numbers and storing them in a file make this faster? Thanks!
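
    Both constraints can be generated fully vectorised in NumPy rather than seven or thirteen numbers at a time. A sketch with placeholder bounds:

        import numpy as np

        rng = np.random.default_rng()
        N = 500_000                                    # batch size from the question

        # 7 values per row with independent upper bounds X1..X7 (placeholder values)
        bounds = np.array([3.0, 7.0, 1.5, 10.0, 2.0, 5.0, 8.0])
        bounded = rng.random((N, 7)) * bounds          # column i is uniform on [0, Xi)

        # 13 non-negative values per row that sum to exactly 1
        simplex = rng.dirichlet(np.ones(13), size=N)

    Note that dirichlet(ones(13)) is uniform over the simplex, a slightly different distribution from normalising 13 independent uniforms by their sum; if the normalise-by-sum distribution is the one that matters, generate rng.random((N, 13)) and divide each row by its row sum instead.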

    Read the article

  • Forcing a method to be non-transactional in JPA (EclipseLink)

    - by rhinds
    Hi, I am developing an application using EclipseLink, and as part of the app I need to be able to manipulate some of the objects, which involves changing data without it being persisted to the database (I'm merging/changing objects for some batch generation processes). I am reluctant to change the data in the entity objects, as there is a risk that even though I have not marked the methods as @Transactional, a method could in the future be inadvertently called from within a transactional method and these changes could be persisted. So my question is: is there any way to get around this? For example, can I force a method to always be non-transactional regardless, or terminate any transaction as soon as the method starts? I know there is a .detach() method that can detach objects from the entity manager; however, there are many objects, and this seems like a potentially error-prone fail-safe in my code.

    Read the article

  • Loading large amounts of data into an Oracle database

    - by James
    Hey all, I was wondering if anyone had any experience with what I am about to embark on. I have several CSV files which are all around a GB in size, and I need to load them into an Oracle database. While most of my work after loading will be read-only, I will need to load updates from time to time. Basically, I just need a good tool for loading several rows of data at a time into my DB. Here is what I have found so far: I could use SQL*Loader to do a lot of the work; I could use bulk-insert commands; and some sort of batch insert using prepared statements might be a good idea. I guess I was wondering what everyone thinks is the fastest way to get this insert done. Any tips?
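
    Alongside SQL*Loader, array binding through a prepared statement is the usual driver-level approach: one executemany round trip carries thousands of rows. A sketch in Python with cx_Oracle (connection string, table name, and column count are placeholders):

        import csv
        import cx_Oracle

        conn = cx_Oracle.connect("user/password@host/service")
        cur = conn.cursor()

        INSERT = "INSERT INTO my_table VALUES (:1, :2, :3)"

        with open("data.csv", newline="") as f:
            batch = []
            for row in csv.reader(f):
                batch.append(row)
                if len(batch) == 10_000:          # one round trip per 10k rows
                    cur.executemany(INSERT, batch)
                    batch.clear()
            if batch:
                cur.executemany(INSERT, batch)

        conn.commit()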

    Read the article

  • Making operator new throw an exception with Microsoft Visual C++

    - by wsd
    As I'm coding for both Windows and Linux, I have a whole host of problems. Microsoft Visual C++ has no stdint header, but for that I wrote my own. Now I have found out that MSVC's new operator does not throw an exception, so I want to fix this as quickly as possible. I know I can define a macro with parameters in parentheses; can I define a macro that replaces

        MyClass x = new MyClass();

    with

        #ifdef MSC_VER
        if (!(MyClass x = new MyClass())) {
            throw new std::bad_alloc();
        }
        #else
        MyClass x = new MyClass();
        #endif

    (or something equivalent), AND works in BOTH MSVC and g++? Or alternatively, if that is not possible, a batch file to run over the code to do this? I have become rather dependent on this exception being thrown.

    Read the article

  • Hibernate / MySQL Bulk insert problem

    - by Marty Pitt
    I'm having trouble getting Hibernate to perform a bulk insert on MySQL. I'm using Hibernate 3.3 and MySQL 5.1. At a high level, this is what's happening:

        @Transactional
        public Set<Long> doUpdate(Project project, IRepository externalSource) {
            List<IEntity> entities = externalSource.loadEntites();
            buildEntities(entities, project);
            persistEntities(project);
        }

        public void persistEntities(Project project) {
            projectDAO.update(project);
        }

    This results in n log entries (one for every row) as follows:

        Hibernate: insert into ProjectEntity (name, parent_id, path, project_id, state, type) values (?, ?, ?, ?, ?, ?)

    I'd like to see this get batched so the update is more performant. It's possible that this routine could generate tens of thousands of rows, and a DB round trip per row is a killer. Why isn't this getting batched? (It's my understanding that batch inserts are supposed to be the default where appropriate in Hibernate.)

    Read the article

  • SharePoint SLK and T-SQL xp_cmdshell safety

    - by Mitchell Skurnik
    I am looking into a T-SQL command called xp_cmdshell, to be used to monitor changes to the SLK (SharePoint Learning Kit) database and then execute a batch or PowerShell script that will trigger some events that I need. (It is bad practice to modify SharePoint's database directly, so I will be using its API.) I have been reading on various blogs and MSDN that there are some security concerns with this approach. The sites suggest that you limit security so the command can be executed by only a specific user role. What other tips/suggestions would you recommend for using xp_cmdshell? Or should I go about this another way and create a script or console application that constantly checks whether a change has been made? I am running Server 2008 with SQL Server 2008.
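
    If xp_cmdshell's attack surface is a worry, the external-watcher alternative mentioned at the end is easy to sketch: poll the database from outside and shell out only from the watcher process, so nothing executable lives inside SQL Server. A rough Python version (the change-detection query and table name are hypothetical; adjust to whatever the SLK schema actually exposes):

        import subprocess
        import time

        import pyodbc

        CONN = ("DRIVER={SQL Server};SERVER=localhost;"
                "DATABASE=SharePointLearningKit;Trusted_Connection=yes")

        cn = pyodbc.connect(CONN, autocommit=True)
        last_seen = None

        while True:
            # Hypothetical change marker; pick a column the SLK schema really updates.
            row = cn.execute("SELECT MAX(AttemptId) FROM dbo.AttemptItem").fetchone()
            if last_seen is not None and row[0] != last_seen:
                subprocess.run(["powershell", "-File", "on_change.ps1"], check=True)
            last_seen = row[0]
            time.sleep(30)    # poll from outside instead of triggering inside T-SQL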

    Read the article

  • How to pass binaries built upstream to a remote downstream build slave

    - by sbi
    We're using Hudson on Windows to build a .NET solution and run the unit tests (NUnit); Hudson starts batch files that do the actual work. I am now trying to set up a new test that is to run on a build slave and will run for a very long time. The test should use the binaries produced by the upstream build. I have searched the Hudson documentation, but I cannot find how to pass upstream build artifacts to downstream slaves. How do I do this?

    Read the article

  • MySQL command-line tool: How to find out number of rows affected by a DELETE?

    - by ambivalence
    I'm trying to run a script that deletes a bunch of rows in a MySQL (InnoDB) table in batches, by executing the following in a loop: mysql --user=MyUser --password=MyPassword MyDatabase < SQL_FILE, where SQL_FILE contains a DELETE FROM ... LIMIT X command. I need to keep running this loop until there are no more matching rows. But unlike running in the mysql shell, the above command does not print the number of rows affected. I've tried -v and -t, but neither works. How can I find out how many rows the batch script affected? Thanks!
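
    One workaround that stays inside the command-line client: end the SQL with SELECT ROW_COUNT();, which reports the rows affected by the preceding statement in the same session, and read the value off stdout. A sketch of the loop in Python (table name and predicate are placeholders):

        import subprocess

        SQL = ("DELETE FROM my_table WHERE created < '2010-01-01' LIMIT 10000; "
               "SELECT ROW_COUNT();")

        while True:
            out = subprocess.run(
                ["mysql", "--user=MyUser", "--password=MyPassword", "MyDatabase",
                 "--batch", "--skip-column-names", "-e", SQL],
                capture_output=True, text=True, check=True)
            if int(out.stdout.strip()) == 0:      # no rows matched; we're done
                break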

    Read the article

  • SQL Server 2008, not enough disk space

    - by snorlaks
    Hello, I'm executing a SQL query on my database. I have SQL Server 2008 installed on my D drive, which has 55 GB free space. I also have a C drive, which has something like 150 MB free (right now). While executing that query on quite a big table (16 GB), I get an error: "An error occurred while executing batch. Error message is: Not enough disk space." I would like to know if there is any way to make SQL Server use the D drive instead of C, or maybe there is some other problem with what I'm doing? Thanks for help

    Read the article

  • efficient video format/codec for sparse & binary blob tracking

    - by user391339
    I am working on a blob tracking project and have many high-definition videos that I would like to reduce in size for storage and downstream tracking/shape analysis. I want to use a lossless method that takes advantage of the black-and-white nature of the video, as well as the fact that not much moves between individual frames. The videos are quite sparse, with 5 to 10 b&w blobs per frame occupying <30% of the frame in total; each blob moves <5-10% of the field of view between frames and does not change shape much over 2-3 frames. I will work in Python, MATLAB, or LabVIEW for this project, and could use a batch utility if available. It may be worthwhile to export the files as compressed image stacks if a proper video format can't be found. What are the pros and cons of this? A video codec uses correlations between neighboring frames, so it should be more efficient, but not if the wrong one is chosen or if it is improperly configured.
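
    One concrete candidate that fits the brief is FFV1, a lossless codec with inter-frame prediction that most FFmpeg-backed tools can write. A hedged batch-conversion sketch in Python with OpenCV (file names are placeholders, and whether the FFV1 fourcc is available depends on the local OpenCV/FFmpeg build):

        import cv2

        src = cv2.VideoCapture("blobs_hd.avi")
        fps = src.get(cv2.CAP_PROP_FPS)
        w = int(src.get(cv2.CAP_PROP_FRAME_WIDTH))
        h = int(src.get(cv2.CAP_PROP_FRAME_HEIGHT))

        fourcc = cv2.VideoWriter_fourcc(*"FFV1")      # lossless, uses inter-frame prediction
        dst = cv2.VideoWriter("blobs_small.avi", fourcc, fps, (w, h), isColor=False)

        while True:
            ok, frame = src.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            # The blobs are pure black and white, so thresholding loses nothing
            # and makes each frame maximally compressible.
            _, bw = cv2.threshold(gray, 127, 255, cv2.THRESH_BINARY)
            dst.write(bw)

        src.release()
        dst.release()

    Compared with a PNG image stack, a codec like this exploits the frame-to-frame redundancy the question describes; the stack is simpler to random-access but stores every frame independently.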

    Read the article

  • How to perform multiple find and replace operations in TextMate besides macros?

    - by T1000
    I have several regular expressions to find and replace text in documents in TextMate. I would like to be able to run them in a batch. I made a macro and it worked, but any tiny modification to the macro means re-recording it, and I can't seem to modify the regex within the TextMate interface - it's read-only for some reason. Can I make it into a command? Does anyone know how? I tried to read the TextMate help about commands, but it wasn't much help; it seems I need prior knowledge of shell scripts of some sort (which I have none of). Any advice in this direction would be great. Thanks in advance.
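
    A TextMate command is just a script: in the Bundle Editor, create a command, set its input to "Document" and its output to "Replace Document", and the body can use any interpreter the shebang names. A sketch in Python with placeholder patterns - editing the RULES table is then a plain text edit, no re-recording:

        #!/usr/bin/env python
        import re
        import sys

        RULES = [
            (r"foo(\d+)", r"bar\1"),   # placeholder: your find/replace pairs go here
            (r"[ \t]+$", ""),          # e.g. strip trailing whitespace
        ]

        text = sys.stdin.read()
        for pattern, repl in RULES:
            text = re.sub(pattern, repl, text, flags=re.MULTILINE)
        sys.stdout.write(text)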

    Read the article

  • How do I get the Bake console for CakePHP?

    - by ggfan
    I am having trouble getting the Bake console. I am on Windows running XAMPP, following the IBM CakePHP tutorial. Here is my directory layout: C:\xampp\htdocs\ibm2 (a test project, originally called cakephp), containing the app, cake, and vendors directories, etc. The tutorial says: to use Bake, cd into the /webroot/app directory and launch the Cake console: ../cake/console/cake bake. You should be presented with a screen that looks like Figure 2. So in the Windows command prompt I cd until I am at C:\xampp\htdocs\ibm2\app, then I type ../cake/console/cake bake, but I get this error: '..' is not recognized as an internal or external command, operable program or batch file. What am I doing wrong?

    Read the article

  • [SOLVED] How do I restore my audio after uninstalling Ventrilo?

    - by Marcx
    Hi, I have a Dell Studio 1555, bought in September, with Windows 7 64-bit Professional on it. The audio device works properly when listening to audio content (from disk or the internet). When I use Ventrilo, the audio from other people sounds good and I hear their voices clearly. When I use any other VoIP program, like TeamSpeak 3, MSN, or Skype, I hear a distorted voice, and it's impossible to understand anything. Everything worked fine until I installed Ventrilo, but removing it didn't solve the problem. Update: here's a sample of how I hear other people's voices: Audio Sample. After some tests, the desktop has the same problem (I tried TeamSpeak 3). Here are some details on my laptop and desktop. Laptop: Dell Studio 1555, Core 2 Duo P8600 2.4 GHz, 4 GB RAM dual channel, ATI HD 4570 512 MB dedicated (up to 2048), IDT High Definition Audio. Desktop: Asus P5KPL-AM motherboard, Dual Core CPU E5200 2.50 GHz, 2x2 GB PC6400 dual channel, ATI Radeon HD 4650 512 MB, VIA High Definition Audio. Both computers have Windows 7 Professional 64-bit. So how do I restore my audio? SOLVED: The problem was in the router firmware. A bug recognized VoIP traffic as a DoS attack and the router garbled every packet; I installed the newest firmware and everything is fine :)

    Read the article

  • deleting a large number of rows from a table

    - by Azeem
    We have a requirement to delete rows on the order of millions from multiple tables as a batch job (note that we are not deleting all the rows; we delete based on a timestamp stored in an indexed column). Obviously a normal DELETE takes forever (because of logging, referential constraint checking, etc.). I know in the LUW world we have ALTER TABLE NOT LOGGED INITIALLY, but I can't seem to find an equivalent SQL statement for DB2 v8 on z/OS. Does anyone have any ideas on how to do this really fast? Also, any ideas on how to avoid the referential checks when deleting the rows? Please let me know.

    Read the article

  • In a pre-commit hook - how to access/compare current and previous versions of files

    - by EthanML
    I'm trying to extend our existing pre-commit SVN hook so that it will check for, and block, any increase in file size for files in specific directories. I've written a Python script to compare two file sizes; it takes two files as arguments and uses sys.exit(0) or (1) to return the result, and this part seems to work fine. My problem is in calling the Python script from the batch file: how do I reference the newly committed and previous versions of each file? The existing code is new to me and a mess of %REPOS%, %TXN%, etc., and I'm not sure how to go about using them. Is there a simple, standard way of doing this? The hook already contains code to loop through the changed files using svnlook changed, so that part shouldn't be an issue. Thanks very much
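
    For reference, %REPOS% and %TXN% are the two arguments Subversion passes to pre-commit: the repository path and the pending transaction id. svnlook cat prints a file's content from HEAD when -t is omitted and from the pending transaction when it is given, which is enough to compare sizes without a working copy. A sketch of the whole hook in Python (the watched directory prefix is a placeholder):

        import subprocess
        import sys

        repos, txn = sys.argv[1], sys.argv[2]      # %REPOS% and %TXN% from the hook

        def size(path, incoming):
            cmd = ["svnlook", "cat"]
            if incoming:
                cmd += ["-t", txn]                 # pending transaction...
            cmd += [repos, path]                   # ...versus current HEAD
            return len(subprocess.run(cmd, capture_output=True, check=True).stdout)

        changed = subprocess.run(["svnlook", "changed", "-t", txn, repos],
                                 capture_output=True, text=True, check=True)
        for line in changed.stdout.splitlines():
            status, path = line.split(None, 1)
            if status == "U" and path.startswith("trunk/assets/"):
                if size(path, True) > size(path, False):
                    sys.stderr.write("%s grew in size; commit rejected\n" % path)
                    sys.exit(1)
        sys.exit(0)

    Newer Subversion releases also provide an svnlook filesize subcommand, if available, which avoids streaming the whole file content just to measure it.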

    Read the article

  • PHP: convert images and upload to Amazon S3

    - by faraklit
    I am looking for a best practice for uploading images to Amazon S3 and serving them from there. We need four different sizes of each image, so just after upload we convert and scale the image to four different widths and heights, then send the results to S3 using the official PHP API:

        // ...
        // image conversions, bucket setting, s3 initialization etc.
        $sizes = array("", "48", "64", "128");
        foreach ($sizes as $size) {
            $filename = $upload_path.$dest_file.$size.$ext;
            $s3->batch()->create_object($bucket, , array(
                'fileUpload' => $filename,
                'acl' => AmazonS3::ACL_PUBLIC,
            ));
        }

    But for a 1 MB image the client sometimes waits up to 30 seconds, which is a very long time. Instead of sending the images to S3 immediately, it may be better to add them to a job queue, but the user should still see the uploaded image immediately.
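
    The job-queue idea keeps the request fast while still getting everything to S3: store the original locally, respond to the user at once (serving the local copy in the meantime), and let a background worker do the conversions and uploads. A minimal in-process sketch in Python with boto3 (bucket and key names are placeholders; a real deployment would use a persistent queue such as SQS or a cron-driven worker):

        import queue
        import threading

        import boto3

        jobs = queue.Queue()
        s3 = boto3.client("s3")

        def worker():
            while True:
                local_path, key = jobs.get()       # blocks until a job arrives
                s3.upload_file(local_path, "my-bucket", key,
                               ExtraArgs={"ACL": "public-read"})
                jobs.task_done()

        threading.Thread(target=worker, daemon=True).start()

        def handle_upload(local_path, base_key):
            # Called at the end of the upload request: enqueue the four renditions
            # and return immediately; the user never waits on S3.
            for size in ("", "48", "64", "128"):
                jobs.put((local_path, "%s%s.jpg" % (base_key, size)))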

    Read the article
