Search Results

Search found 11536 results on 462 pages for 'whole foods market'.


  • Is there a method / system / program to keep track of different stages and changes in writing the code?

    - by Luay
    Forgive me, but I don't know the technical term for what I should search for. I am trying to find a way to keep track of changes in my code during the development of my program: something that would let me go back to a section of code that I deleted. I am not talking about "undo", but rather a way to keep track of, or be able to retrieve, a section of my code that I deleted but now want back. Is there such a way? If there is, what is this whole system/procedure called? Is there something that integrates with Visual Studio 2010? Many thanks for your help.

    Read the article

  • Generating dynamic Word documents for mass mailing

    - by bluesystem
    I need to generate a mass mailing based on a Word document model with PHP. Given is a database with the addresses and the data that need to be filled into my Word model. I want to generate a single Word document with the different addresses and field contents from the database. We have a Linux server and the COM object is not available. Is there a ready-to-use class to do this? Do you have any experience with PHPWord? What is the best practice in this case? In the ideal case the client should just upload the Word master document, with the fields that need to be filled, which is then merged into a multi-page Word document containing the whole mailing.

    Read the article

  • k-means clustering in R on very large, sparse matrix?

    - by movingabout
    Hello, I am trying to do some k-means clustering on a very large matrix. The matrix is approximately 500,000 rows x 4,000 columns, yet very sparse (only a couple of "1" values per row). The whole thing does not fit into memory, so I converted it into a sparse ARFF file. But R obviously can't read the sparse ARFF file format. I also have the data as a plain CSV file. Is there any package available in R for loading such sparse matrices efficiently? I'd then use the regular k-means algorithm from the cluster package to proceed. Many thanks.
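
    If staying in R proves difficult, one alternative (purely a sketch, and in Python rather than R) is to build a scipy sparse matrix and cluster it with scikit-learn's MiniBatchKMeans, which accepts sparse input. The CSV of non-zero index pairs, the file name, and the cluster count below are all assumptions:

        # Sketch only: cluster a large, sparse 0/1 matrix without densifying it.
        # Assumes a CSV of "row,col" index pairs for the non-zero entries.
        import numpy as np
        from scipy.sparse import csr_matrix
        from sklearn.cluster import MiniBatchKMeans

        pairs = np.loadtxt("nonzero_indices.csv", delimiter=",", dtype=np.int64)
        rows, cols = pairs[:, 0], pairs[:, 1]
        data = np.ones(len(rows), dtype=np.float32)
        X = csr_matrix((data, (rows, cols)), shape=(500000, 4000))

        km = MiniBatchKMeans(n_clusters=100, batch_size=10000, random_state=0)
        labels = km.fit_predict(X)   # works directly on the sparse matrix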

    Read the article

  • ModalPopupExtender doesn't work on IE6 frame layout

    - by Joe
    I'm using a "frame" layout similar to the one in this excellent answer: a div #top at the top of the page, a div#left on the left, and a div #main with the main content. The #top and #left divs contain navigation menus. Now I want to use a popup div using the AjaxControlToolkit ModalPopupExtender inside the content (#main) div. This works fine on IE8 (where #top, #left, #main all have position:fixed), but when I run it on IE6, the modal background only covers the #main div - I need it to cover the whole of the page including the #top and #left navigation divs. Looking at the script for ModalPopupExtender, it appears to be searching up the parent hierarchy until it finds a parent with position relative or absolute. And in the IE6 rendering, the #main div has position:absolute because position:fixed is not supported, which I guess explains what is happening. Any suggestions for the best/easiest way to get this working properly on IE6? Ideally without modifying the ModalPopupExtender code, but I'll do this if I have to and it's the best solution.

    Read the article

  • MEMORY(HEAP) vs. InnoDB in a Read and Write Environment

    - by Johannes
    I want to program a real-time application using MySQL. It needs a small table (less than 10,000 rows) that will be under heavy read (scan) and write (update and some insert/delete) load. I am really speaking of 10,000 updates or selects per second. These statements will be executed on only a few (less than 10) open MySQL connections. The table is small and does not contain any data that needs to be stored on disk. So I ask: which is faster, InnoDB or MEMORY (HEAP)? My thoughts are: both engines will probably serve SELECTs directly from memory, as even InnoDB will cache the whole table. What about the UPDATEs? (innodb_flush_log_at_trx_commit?) My main concern is the locking behavior: InnoDB row locks vs. the MEMORY engine's table locks. Will this be the bottleneck in the MEMORY implementation? Thanks for your thoughts!

    Read the article

  • Debugging ASP.NET with IIS

    - by Ariel
    I've set up debugging in Visual Studio 2008 to use IIS instead of the built-in server so I can run URL rewriting while developing (using IIRF). It took a while to get it to work (changing permissions, reinstalling the .NET Framework) but it was working. I turned off my machine, and now that it's back on the debugger won't start. I'm using Parallels with Windows XP on an iMac. "Unable to start debugging on the web server. Could not start ASP.NET debugging. More information may be available by starting the project without debugging" I read a whole bunch of posts on SO and googled the subject, but none seem to provide a working answer. Has anyone encountered this and knows how to get it to work? Thanks.

    Read the article

  • BufferedReader ready method in a while loop to determine EOF?

    - by BobTurbo
    I have a large file (the Wikipedia English articles database as an XML file) which I am reading one character at a time using BufferedReader. The pseudocode is:

        file = BufferedReader...
        while (file.ready())
            character = file.read()

    Is this actually valid? Or will ready() just return false when it is waiting for the HDD to return data, and not only when EOF has been reached? I tried to use if (file.read() == -1) but seemed to run into an infinite loop whose cause I literally could not find. I am just wondering if it is reading the whole file, as my statistics say 444,380 Wikipedia pages have been read, but I thought there were many more articles.
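
    For what it's worth, the usual idiom is to loop on the value returned by read() and stop at the end-of-file sentinel rather than relying on ready(). A rough analogue of that loop structure, sketched in Python purely for illustration (in Java the sentinel is read() returning -1):

        def count_chars(path):
            # Read one character at a time and stop at EOF, not on "is data ready?".
            count = 0
            with open(path, "r", encoding="utf-8") as f:
                while True:
                    ch = f.read(1)
                    if ch == "":      # empty string signals end of file
                        break
                    count += 1
            return count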

    Read the article

  • A completely decoupled OO system?

    - by shrini1000
    To make an OO system as decoupled as possible, I'm thinking of the following approach:

    1) We run an RMI/directory-like service where objects can register and discover each other. They talk to this service through an interface.
    2) We run a messaging service to which objects can publish messages and register subscription callbacks. Again, this happens through interfaces.
    3) When object A wants to invoke a method on object B, it discovers the target object's unique identity through #1 above, and publishes a message on the message service for object B.
    4) The message service invokes B's callback to give it the message.
    5) B processes the request and sends the response for A on the message service.
    6) A's callback is called and it gets the response.

    I feel this system is as decoupled as practically possible, but it has the following problems: 1) communication is typically asynchronous; 2) hence it's not real-time; 3) the system as a whole is less efficient. Are there any other practical problems where this design obviously won't be applicable? What are your thoughts on this design in general?
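
    As an illustration of steps 2-6 (the publish/subscribe part only), here is a minimal in-process sketch in Python; it ignores the directory service, networking and asynchrony, and all names and topics are made up:

        class MessageBus:
            """Toy message bus: objects register callbacks and publish to topics."""

            def __init__(self):
                self._subscribers = {}          # topic -> list of callbacks

            def subscribe(self, topic, callback):
                self._subscribers.setdefault(topic, []).append(callback)

            def publish(self, topic, payload):
                for callback in self._subscribers.get(topic, []):
                    callback(payload)


        bus = MessageBus()
        # Object B subscribes to requests addressed to it and replies on another topic.
        bus.subscribe("B.request", lambda msg: bus.publish("A.response", msg * 2))
        # Object A subscribes to its response topic, then publishes a request for B.
        bus.subscribe("A.response", lambda msg: print("A got:", msg))
        bus.publish("B.request", 21)            # prints "A got: 42"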

    Read the article

  • "set -e" in shell and command substitution

    - by ivant
    In shell scripts set -e is often used to make them more robust, by stopping the script when one of the commands executed by the script exits with a non-zero exit code. It's usually easy to specify that you don't care about some command succeeding by adding || true at the end. The problem appears when you actually care about the return value but don't want the script to stop on a non-zero return code, for example:

        output=$(possibly-failing-command)
        if [ 0 == $? -a -n "$output" ]; then
            ...
        else
            ...
        fi

    Here we want to both check the exit code (thus we can't use || true inside the command substitution expression) and get the output. However, if the command in the command substitution fails, the whole script stops due to set -e. Is there a clean way to prevent the script from stopping here, without unsetting -e and setting it back afterwards?

    Read the article

  • Axis xsd:dateTime issue

    - by milostrivun
    Here is the whole thing: I'm using a web service, and with wsdl2java I generate classes and communicate with another application. The problem is that when I use a method from my generated WS client, the result contains an object with data in xsd:dateTime format (Axis 1.4), and that data, represented by a java.util.Calendar object in Java, has its time shifted to GMT (about two hours before my time zone). That results in bad data from which I have to subtract two hours to display the correct value. My question is: considering I didn't work on building that web service (all I have is the WSDL URL), where is the problem and can I fix it, or is the problem on the side of the web service's creator? If it is not clear what I am asking, I will gladly explain as much as I can.

    Read the article

  • Playing animations in sequence in Objective-C

    - by Ohmnastrum
    I'm trying to play animations in sequence, but I'm having issues playing them as a for loop iterates through the list of objects in an array. It will move through the array, but it won't play each one; it just plays the last...

        -(void) startGame {
            gramma.animationDuration = 0.5;
            // Repeat forever
            gramma.animationRepeatCount = 1;
            int r = arc4random() % 4;
            [colorChoices addObject:[NSNumber numberWithInt:r]];
            int anInt = [[colorChoices objectAtIndex:0] integerValue];
            NSLog(@"%d", anInt);
            for (int i = 0; i < colorChoices.count; i++) {
                [self StrikeFrog:[[colorChoices objectAtIndex:i] integerValue]];
                //NSLog(@"%d", [[colorChoices objectAtIndex:i] integerValue]);
                sleep(1);
            }
        }

    It moves through the whole cycle really fast, and sleep isn't doing anything to allow it to play each animation... any suggestions?

    Read the article

  • Fastest way to convert a file from latin1 to utf-8 in Python

    - by xsaero00
    I need the fastest way to convert files from latin1 to utf-8 in Python. The files are large, ~2 GB (I am moving DB data). So far I have:

        import codecs
        infile = codecs.open(tmpfile, 'r', encoding='latin1')
        outfile = codecs.open(tmpfile1, 'w', encoding='utf-8')
        for line in infile:
            outfile.write(line)
        infile.close()
        outfile.close()

    but it is still slow. The conversion takes one fourth of the whole migration time. I could also use a Linux command-line utility if it is faster than native Python code.
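
    One thing worth trying (a sketch, not a benchmark result) is reading and writing in large chunks instead of line by line, which cuts down on Python-level loop overhead; the chunk size below is an arbitrary assumption. On the command-line side, iconv is the usual candidate to benchmark against.

        def convert(src, dst, chunk_chars=1 << 20):
            # Decode from latin-1 and re-encode as UTF-8 in ~1M-character chunks.
            with open(src, "r", encoding="latin-1") as infile, \
                 open(dst, "w", encoding="utf-8") as outfile:
                while True:
                    chunk = infile.read(chunk_chars)
                    if not chunk:
                        break
                    outfile.write(chunk)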

    Read the article

  • Connection Pooling

    - by cshah
    I have the following code. If I set Conn = null in finally, do I still use connection pooling? I know it is good practice to close your connection, but what about disposing of the whole connection object?

        public void ExecuteNonQuery(SqlCommand Cmd)
        {
            //========== Connection ==========//
            SqlConnection Conn = new SqlConnection(strConStr);
            try
            {
                //========== Open Connection ==========//
                Conn.Open();
                //========== Execute Command ==========//
                Cmd.Connection = Conn;
                Cmd.CommandTimeout = 180;
                Cmd.ExecuteNonQuery();
            }
            catch (Exception Exc)
            {
                throw Exc;
            }
            finally
            {
                //======== Closing Connection ========//
                if (Conn.State == ConnectionState.Open)
                {
                    Conn.Close();
                }
                //======== Disposing object ========//
                Conn = null;
            }
        }

    Read the article

  • Console-App to get all open files for processes

    - by t.kehl
    Hi, I am searching for a console app (whose output I can pipe to a text file) which gives me a list of all current processes and the files each process has open. The tool should also work when the user doesn't have administrative privileges, and it should also give file paths which are located on the network (UNC and absolute/mapped). Is there something like this which I can call from another tool to get the information? I am on a Windows system. I have an open file name and now need to get the whole path for the file.
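
    Not a ready-made console tool, but as a sketch of the same idea from a script: the third-party psutil package can list per-process open files. Coverage without administrative privileges is limited to the processes the account can see, and using psutil at all is an assumption here, not something from the question.

        import psutil  # third-party: pip install psutil

        def dump_open_files():
            # Print "pid, process name, open file path" for every visible process.
            for proc in psutil.process_iter(["pid", "name"]):
                try:
                    for f in proc.open_files():
                        print(proc.info["pid"], proc.info["name"], f.path)
                except (psutil.AccessDenied, psutil.NoSuchProcess):
                    continue  # other users' processes may be off limits

        if __name__ == "__main__":
            dump_open_files()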

    Read the article

  • Cloud sync between iPad/iPhone app

    - by Macatomy
    I have a Core Data app that will end up being an iPhone/iPad universal application. I would like to implement cloud syncing so that an iPhone and an iPad both running the app can share data. I'm planning to use the recently released Dropbox API. Does anyone have any thoughts on the best way to go about doing this? The Dropbox API allows apps to store files in the cloud. What I was thinking was to initially store the app's database (sqlite) in the cloud and then download that database, but I then realized that this method would make it painfully difficult to merge changes (rather than replacing the whole database). Any thoughts are appreciated. Thanks.
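
    One record-level strategy (sketched here in Python purely to illustrate the idea, not Core Data or Dropbox specifics) is to sync per-record changes and merge by modification time instead of shipping the whole sqlite file; deletions and real conflict handling are deliberately left out of this sketch:

        def merge(local, remote):
            """Last-writer-wins merge of two {record_id: record} dicts.

            Each record is assumed to carry a 'modified' timestamp (an assumption
            of this sketch, not part of the original question).
            """
            merged = dict(local)
            for record_id, record in remote.items():
                if record_id not in merged or record["modified"] > merged[record_id]["modified"]:
                    merged[record_id] = record
            return merged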

    Read the article

  • How can I persist a large Perl object for re-use between runs?

    - by Alnitak
    I've got a large XML file, which takes over 40 seconds to parse with XML::Simple. I'd like to be able to cache the resulting parsed object so that on the next run I can just retrieve the parsed object and not reparse the whole file. I've looked at using Data::Dumper, but the documentation is a bit lacking on how to store and retrieve its output from disk files. Other classes I've looked at (e.g. Cache::Cache) appear designed for storage of many small objects, not a single large one. Can anyone recommend a module designed for this? EDIT: The XML file is ftp://ftp.rfc-editor.org/in-notes/rfc-index.xml. On my Mac Pro, benchmark figures for reading the entire file with XML::Simple vs. Storable are:

                s/iter   test1   test2
        test1     47.8      --   -100%
        test2    0.148  32185%      --
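
    Storable's store() and retrieve() are the usual Perl answer for freezing a parsed structure to disk, which is what the benchmark above appears to compare. For illustration only, the same parse-once-and-cache idea sketched in Python, with the file names and the "flatten to plain data" step being assumptions of the sketch:

        import os
        import pickle
        import xml.etree.ElementTree as ET

        def load_rfc_index(xml_path="rfc-index.xml", cache_path="rfc-index.pickle"):
            # Reuse the cached parse when it is newer than the XML source.
            if (os.path.exists(cache_path)
                    and os.path.getmtime(cache_path) >= os.path.getmtime(xml_path)):
                with open(cache_path, "rb") as f:
                    return pickle.load(f)
            # Slow path: parse the XML and reduce it to plain, picklable data.
            root = ET.parse(xml_path).getroot()
            data = [{"tag": el.tag, "text": (el.text or "").strip()} for el in root.iter()]
            with open(cache_path, "wb") as f:
                pickle.dump(data, f)
            return data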

    Read the article

  • What good practices, if any, has the agile movement lost?

    - by clarke ching
    I am a long-time agile advocate, but one of the things that bothers me about Agile is that a lot of agile practitioners, especially the younger ones, have thrown out or are missing a whole lot of good (non-Scrum, non-XP) practices. Alistair Cockburn's style of writing Use Cases springs to mind; orthogonal arrays (pairwise testing) are another. I hope this is an okay forum to ask this, but since I read mostly Agile-related books and articles and work with mostly Agile folk... is there anything I'm missing? Thanks for all your help. Stack Overflow is a fantastic resource.

    Read the article

  • .rdlc reporting bound to an Object Data Source in a three-layer application

    - by Saeedouv
    Hi, I have the following situation: I have a stand-alone Reporting layer in an ASP.NET application (NOT a website, which means NO App_Code folder exists), and I just want to create an Object Data Source that takes an object from a separate layer (let's say from the Data Access Layer), and then use that Object Data Source to create a report. I have spent my whole day working on that; there are tons of workarounds and articles on the web, but none mention what I really want to do. Any answer is appreciated... Just to make things more clear here, assume the following: I have a solution with the following layers: UI; Reporting (has NO Employees object, just a reference); Business Logic; Data Access Layer (Employees -- GetEmployees()). All I need is as mentioned above: I want to create an Object Data Source from the Reporting layer, to take the Employee object from the DAL, and then use its GetEmployees method to be added to the report. I think it's more clear now, since the Reporting layer also has NO App_Code folder.

    Read the article

  • Which Language to target on Ubuntu?

    - by WeNeedAnswers
    I'm a C# programmer by trade, looking to move my wares over to Ubuntu as a business concern. I have some experience with Python and like it a lot. My question is: as a developer, which would be the best language to use when targeting Ubuntu as a commercial concern, Mono C# or Python? Please note that I am not interested in the technical aspects but strictly in the commercials of where Ubuntu is heading. I see that there is a lot of work being done using Python, and I'm thinking that maybe the whole Mono issue of who "might" purchase them comes into play.

    Read the article

  • Remove parent XML tag based on child value

    - by cru3l
    For example, we have an XML file with this format:

        <A>
          <B>
            <C></C>
            <D></D>
            <D></D>
          </B>
        </A>

    I need this: if all the "D" elements are empty, then we need to delete the whole "A" element, and of course we need to do this for all "A" tags in the XML.
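
    An XSLT identity transform with an empty template matching the offending elements is one common answer. For illustration only, here is a Python/ElementTree sketch of the same pruning; it assumes the "A" elements sit under a single wrapper root and that "empty" means no text content, both of which are assumptions rather than part of the question:

        import xml.etree.ElementTree as ET

        def prune(tree):
            # Drop every <A> whose <D> descendants are all empty of text.
            root = tree.getroot()
            for a in list(root.findall("A")):
                d_elems = a.findall(".//D")
                if d_elems and all(not (d.text or "").strip() for d in d_elems):
                    root.remove(a)
            return tree

        tree = prune(ET.parse("input.xml"))
        tree.write("output.xml")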

    Read the article

  • Tips about a good class structure for a website? (PHP)

    - by Martti Laine
    Hello. I'm creating a kind of massive network for users to register and log in. I want to try using classes, but I've never used them (except some MySQL wrappers etc.). Could you provide some tips and a sample structure for my project? The idea is to simply have an index.php which prints the whole page and does all the action. index.php calls functions from classes inside other files. I need: a user class for checking whether the user is logged in and retrieving user info, and different kinds of "page" classes for the functions needed on those pages. I'm not asking for full code, just a start. I don't know how to use public functions or anything like that. How do I wrap these classes to work together? So no functions, just the structure! Martti Laine

    Read the article

  • Select Columns Only if String length is greater than 2

    - by Zee-pro
    A similar question may have been asked, but I am unable to find anything that fits my needs. How can I select only columns where the string length is greater than 2? This is how much I have done so far:

        SELECT *
        FROM Table1
        WHERE (Table1.ID = @ID)

    Or something like:

        WHERE (Table1.ID = @ID) AND (LEN(*) > 2)

    Thanks for all of your help. I have a table with 35 columns and a User ID column, and now I want to select and display information from only those columns whose string is longer than 2 characters. I'd like to select only the columns with such a string and the ID defined by the user, not the whole row!! I hope I am making sense. Table and desired result: DI 35, Lesson 4 Maths, Lesson 9 ICT, Lesson 12 English

    Read the article

  • Conceptually, how does replay work in a game?

    - by SnOrfus
    I was kind of curious as to how replay might be implemented in a game. Initially, I thought there would just be a command list of every player/AI action taken in the game, which then 're-plays' the game and lets the engine render as usual. However, I have looked at replays in FPS/RTS games, and upon careful inspection even things like the particles and graphical/audible glitches are consistent (and those glitches are generally *in*consistent). So how does this happen? In fixed-camera-angle games I thought it might just write every frame of the whole scene to a stream that gets stored and then replay the stream back, but that doesn't seem like enough for games that allow you to pause and move the camera around. You'd have to store the locations of everything in the scene at all points in time (no?). So for things like particles, that's a lot of data to push, which seems like a significant drain on the game's performance while playing.
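
    The usual trick is that the simulation is deterministic: record the initial state, the random-number seed, and the timestamped inputs, then re-run the same simulation during playback, so "random" effects such as particles come out identical without storing any frames. A toy sketch of that idea in Python (the state update rule here is made up purely to show determinism):

        import random

        def simulate(seed, commands):
            # The same seed and the same command list always produce the same state,
            # including every "random" value drawn from the seeded generator.
            rng = random.Random(seed)
            state = 0
            for tick, cmd in commands:
                state = state * 31 + tick + cmd + rng.randint(0, 9)
            return state

        recording = {"seed": 1234, "commands": [(1, 5), (4, 2), (9, 7)]}
        live = simulate(recording["seed"], recording["commands"])
        replayed = simulate(recording["seed"], recording["commands"])
        assert live == replayed   # the replay reproduces the original run exactly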

    Read the article

  • Any way to chunk gzip with Apache and PHP

    - by donatJ
    I have a web application on a site that takes a while (~10 seconds) to complete a portion of the page near the bottom - it has been optimized as much as it can be, and caching is not an option. We have compression enabled on the server via an .htaccess directive:

        SetOutputFilter DEFLATE

    The problem is that this causes the whole page to be held until completion before it starts being output to the user, which is not optimal, as the user sees nothing until the page completes. I have also tried it via the PHP ob_start("ob_gzhandler") method. Currently I have a <FilesMatch > in my .htaccess excluding this specific script from compression. Basically my question is this: is there a way to, say, chunk gzip or deflate output so that the user gets it in pieces and can see that the page has begun loading?

    Read the article

  • It is said that Mercurial's "hg clone" is very cheap... but it is 400MB on my hard drive? (on Mac OS X)

    - by Jian Lin
    I have a project I cloned over the network to the Mac hard drive (OS X Snow Leopard). The project is about 1 GB on the hard drive:

        du -s
        2073848 .

    So when I hg clone proj proj2, then:

        MacBook-Pro ~/development $ du -s proj
        2073848 proj
        MacBook-Pro ~/development $ du -s proj2
        894840 proj2
        MacBook-Pro ~/development $ du -s
        2397928 .

    So the clone seems not so cheap... probably around 400 MB. Is that so? Also, the whole folder grew by only about 200 MB, which is not the total of proj and proj2, by the way... are some files links and some not links, and is that why the overlap is not counted twice?

    Read the article
