Search Results

Search found 89549 results on 3582 pages for 'large file support'.


  • Quickly compute added and removed lines

    - by Philippe Marschall
    I'm trying to compare two text files and compute how many lines were added and removed: basically what git diff --stat does. Bonus points for not having to store the entire file contents in memory. The approach I currently have in mind is:
    - read each line of the old file
    - compute a hash (probably MD5 or SHA-1) for each line
    - store the hashes in a set
    - do the same for each line in the new file
    - every hash from the old file's set that's missing from the new file's set was removed
    - every hash from the new file's set that's missing from the old file's set was added
    I'll probably want to exclude empty and all-whitespace lines. There is a small issue with duplicated lines. It can be solved either by additionally storing how often each hash appears, or by comparing the line counts of the old and new files and adjusting the added or removed figures so that the numbers add up. Do you see room for improvement, or a better approach?
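
    A minimal sketch of the counting approach described above, in Python, using a multiset of line hashes to handle duplicates (the file names are placeholders):

        import hashlib
        from collections import Counter

        def line_counts(path):
            # Map each non-blank line to how often its hash occurs; only
            # hashes are kept in memory, never the full file contents.
            counts = Counter()
            with open(path, "rb") as f:
                for line in f:
                    if line.strip():
                        counts[hashlib.sha1(line.strip()).digest()] += 1
            return counts

        old, new = line_counts("old.txt"), line_counts("new.txt")
        removed = sum((old - new).values())  # multiset difference: only in old
        added = sum((new - old).values())    # multiset difference: only in new
        print(f"{added} insertions(+), {removed} deletions(-)")

    Note that this counts a moved line as neither added nor removed, so the totals can differ from git diff --stat, which works positionally.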

    Read the article

  • More Efficient Data Structure for Large Layered Tile Map

    - by Stupac
    It seems like the popular method is to break the map up into regions and load them as needed. My problem is that in my game there are many AI entities besides the player performing actions in virtually all regions of the map. Let's say I have a 5000x5000 map: when I use a 2D array of bytes to represent it, my game uses around 17 MB of memory, but as soon as I change that data structure to my own MapCell class (which contains only a single field: byte terrain), my game's memory consumption rockets up to 400+ MB. I plan on adding layering, so an array of bytes won't cut it, and I figure I'd need to add a List of some sort to the MapCell class to hold the objects in the layers. I'm only rendering tiles that are on screen, but I need the rest of the map to be represented in memory since it is constantly used in Update. So my question is: how can I reduce the memory consumption of my map while still meeting the above requirements? Thank you for your time! Here are a few snippets of my C# code in XNA 4:

        public static void LoadMapData()
        {
            // Test map generation
            int xSize = 5000;
            int ySize = 5000;
            MapCell[,] map = new MapCell[xSize, ySize];
            //byte[,] map = new byte[xSize, ySize];

            Terrain[] terrains = new Terrain[4];
            terrains[0] = grass;
            terrains[1] = dirt;
            terrains[2] = rock;
            terrains[3] = water;

            Random random = new Random();
            for (int x = 0; x < xSize; x++)
            {
                for (int y = 0; y < ySize; y++)
                {
                    //map[x, y] = new MapCell(terrains[random.Next(4)]);
                    map[x, y] = new MapCell((byte)random.Next(4));
                    //map[x, y] = (byte)random.Next(4);
                }
            }

            testMap = new TileMap(map, xSize, ySize);
            // End test map setup
            currentMap = testMap;
        }

        public class MapCell
        {
            // the type of terrain this cell is treated as
            //public TerrainType terrain;
            public byte terrain;

            public MapCell(byte itsTerrain)
            {
                terrain = itsTerrain;
            }

            /*public Terrain terrain { get; set; }
            public MapCell(Terrain itsTerrain)
            {
                terrain = itsTerrain;
            }*/
        }
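
    The flat byte-array idea extends to layers without needing per-cell objects: store one contiguous buffer of width x height x layers bytes and compute the index. A language-agnostic illustration in Python (the map dimensions are the question's own; the layer count is made up):

        # A 5000 x 5000 map with 3 layers in one contiguous byte buffer:
        # one byte per cell per layer (~75 MB total), no per-cell objects.
        WIDTH, HEIGHT, LAYERS = 5000, 5000, 3
        terrain = bytearray(WIDTH * HEIGHT * LAYERS)

        def cell_index(x, y, layer):
            # Row-major layout: all layers of a cell are adjacent in memory.
            return (y * WIDTH + x) * LAYERS + layer

        terrain[cell_index(10, 20, 0)] = 3   # e.g. water on the ground layer
        print(terrain[cell_index(10, 20, 0)])

    In C#, the same layout works with a plain byte[] or a struct MapCell, since value types avoid the per-object header and reference overhead that make an array of MapCell class instances balloon.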

    Read the article

  • Writing Large Portions Of Code Then Debugging?

    - by The Floating Brain
    Lately I have been writing a game engine, and I have been writing a lot of "foundation stuff" (standard interfaces, modules, a message system, etc.), but I have noticed a pattern: a lot of the stuff is interdependent and I cannot debug until everything is done, hence I do not debug for about 3 to 5 hours at a time. I am wondering if this is an acceptable practice for this part of the project, and if not, whether anyone can give me some advice? -----Update-----: I downloaded some code metrics tools, and my program's cyclomatic complexity is 1.52, which as I understand it is good and should correlate with high cohesion. If I am wrong, please correct me.

    Read the article

  • Making large scale changes to an economy in a social game

    - by Zach
    Are there any examples or case studies of social games, specifically on Facebook, where the developer has made drastic changes to the economy? I'm specifically interested in examples where the old economy was based on purchasing items with Facebook credits and then moved to a new model where the same or similar inventory is sold for a soft currency. The closest comparisons I've been able to find so far are iOS games that have gone from purchase models to freemium models, but I haven't found a comparable scenario in a social game besides larger-scale MMOs.

    Read the article

  • Extremely large spike in traffic on the 1st - 4th of every month from mobile browsers

    - by wsanville
    I've noticed that from the 1st to the 4th of each recent month (since January), several sites I maintain get thousands of requests from mobile browsers, whereas throughout the rest of the month the numbers are in the single or double digits. Has anybody else noticed this sort of behavior? I don't have the exact user agents logged, but my analysis software (WebTrends) reports the traffic as mostly iPhone/iPad/iPod, Android, and BlackBerry.

    Read the article

  • Failed install 12.04 on Intel Hardware Raid with Large Partition (> 2TB)

    - by Michael Wiles
    I have Intel hardware RAID on the motherboard, with ten 2 TB HDDs that I've configured as RAID 1+0 to appear as one big 8 TB drive. Now I'm trying to install Ubuntu 12.04 on it. After installing with the default desktop installation disk, I get a blank screen with a flashing cursor. If I try the alternate disk's guided partitioning option, I get "error: out of disk" and the GRUB prompt. If I boot with the rescue disk or the like, I can drop into a shell and view the disk, and everything also installs without an issue. I don't know what to do...

    Read the article

  • Large resolution differences

    - by Robin Betka
    I want to develop a game for multiple devices such as PC, Android, and iOS. I want it to be 1080p, but that means a massive scale-down for the smartphones. I know how to do that: just render everything to a 1080p render target and then draw it to the screen at a smaller size. But what should I do so that the scaling down doesn't look bad and blurry? I can't do it vector-based or anything, because the sprites simply need a specific size. Should I make the sprites power-of-two sized to get some nice mipmapping? And what other settings can I tweak? Or should I rather go with a lower resolution, at the cost of a slightly worse-looking PC version? Performance does not seem to be a problem for me, so it would be a shame not to use 1080p because of other problems.

    Read the article

  • TSQL Tuesday #15 – Maintaining Your Sanity While Managing Large Environments

    - by Jonathan Kehayias
    This month’s TSQL Tuesday event is being hosted by Pat Wright (Blog | Twitter) and the topic this month is Automation! “I figured that since many of you out there set a goal this year to blog more and to learn Powershell then this Topic should help in both of those goals. So the topic I have chosen for this month is Automation! It can be Automation with T-SQL or with Powershell or a mix of both. Give us your best tips/tricks and ideas for making our lives easier through Automation.” Automation is...(read more)

    Read the article

  • jQuery 2.0 drops support for legacy IE (IE6, IE7, IE8)

    - by Renso
    Originally posted on: http://geekswithblogs.net/renso/archive/2013/10/31/jquery-2.0-drops-support-for-legacy-ie-ie6-ie7-ie8.aspx
    jQuery upgrades may not be as backward compatible as you may think. Starting with version 2, jQuery no longer supports IE6, IE7, or, yes, IE8; these are now considered legacy browsers. You will need to hold off on jQuery upgrades and remain on jQuery 1.9 until your SLA states that IE8 no longer has to be supported. Some of the reasons for dropping IE8 and earlier:
    - It removes all the clutter in the jQuery library: code that had to deal with compatibility differences between IE6, 7, and 8 and the newer, more compliant IE versions.
    - IE6 and 7 may have fallen below 2% generally, but that does not mean it is true for your client base. In the oil and gas industry some clients are years behind, and you may have 50% or more of your clients remaining on IE8 or older for the foreseeable future.
    - The differences between browser engines have become almost negligible, as they should be. So one of jQuery's greatest goals, abstracting that away for developers, is no longer needed, for the most part anyway.
    - CSS3 features like animations basically replace the need for jQuery 2.0's animations and effects.
    If you still need to support IE8 or earlier but also want to upgrade, use conditional comments:

        <!--[if lt IE 9]>
            <script src="jQuery-1.9.0.js"></script>
        <![endif]-->
        <!--[if gte IE 9]><!-->
            <script src="jQuery-2.0.0.js"></script>
        <!--<![endif]-->

    Read the article

  • For a Javascript library, what is the best or standard way to support extensibility

    - by Michael Best
    Specifically, I want to support "plugins" that modify the behavior of parts of the library. I couldn't find much information on the web about this subject, but here are my ideas for how a library could be made extensible.
    1. The library exports an object with both public and "protected" functions. A plugin can replace any of those functions, thus modifying the library's behavior. Advantages of this method are that it's simple and that the plugin's functions have full access to the library's "protected" functions. Disadvantages are that the library may be harder to maintain with a larger set of exposed functions, and it could be hard to debug if multiple plugins are involved (how do you know which plugin modified which function?).
    2. The library provides an "add plugin" function that accepts an object with a specific interface. Internally, the library uses the plugin instead of its own code where appropriate. With this method, the internals of the library can be rearranged more freely as long as it still supports the same plugin interface. This could also support different plugin interfaces for modifying different parts of the library. A disadvantage is that plugins may have to re-implement code that is already part of the library, since the library's internal functions are not exported. (See the sketch after this list.)
    3. The library provides a "set implementation" function that accepts an object inherited from a specific base object. The library's public API calls functions on the implementation object for any functionality that can be modified, and the base implementation object includes the core functionality, with both external (to the API) and internal functions. A plugin creates a new implementation object, which inherits from the base object and replaces any functions it wants to modify. This combines advantages and disadvantages of both the other methods.
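
    A minimal sketch of the second approach (an "add plugin" registration function with a fixed hook interface), written in Python rather than JavaScript for brevity; all names and the single transform() hook are illustrative:

        class Library:
            def __init__(self):
                self._plugins = []

            def add_plugin(self, plugin):
                # A plugin is any object implementing the agreed interface;
                # here that is a single optional hook: transform(value).
                self._plugins.append(plugin)

            def process(self, value):
                result = value.upper()  # the library's own core behavior
                # Registered plugins may then modify the result in turn.
                for plugin in self._plugins:
                    if hasattr(plugin, "transform"):
                        result = plugin.transform(result)
                return result

        class ReverserPlugin:
            def transform(self, value):
                return value[::-1]

        lib = Library()
        lib.add_plugin(ReverserPlugin())
        print(lib.process("abc"))  # -> "CBA"

    The same shape carries over to JavaScript directly: an addPlugin() function pushes an object onto an array, and the library consults that array at its extension points.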

    Read the article

  • What is the simplest and fastest way to transfer a large file through a Windows network?

    - by Sake
    I have a Windows 2000 Server machine running MS SQL Server that stores over 20 GB of data. The database is backed up every day to the second hard drive. I want to transfer those backup files to another computer to build another test server and to practice recovery (the backup has never actually been restored in almost 5 years; don't tell my boss about that!). I have trouble transferring that huge file over the network. I've tried a plain network copy, an Apache download, and FTP; every method I tried ends up failing once the amount of data transferred reaches 2 GB. The last time I successfully transferred the file, it was via a USB-attached external hard drive. But I want to perform this task routinely, and preferably automatically. What is the most pragmatic approach for this situation?
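
    One workaround when every tool chokes at the 2 GB mark is to split the backup into pieces below that limit, transfer them, verify, and reassemble. A minimal sketch in Python (the paths are placeholders, and the 1 GB piece size is an assumption):

        import hashlib
        import os

        CHUNK = 1024 * 1024 * 1024  # 1 GB pieces, safely under the 2 GB limit

        def split_file(path, out_dir):
            os.makedirs(out_dir, exist_ok=True)
            with open(path, "rb") as src:
                for i, piece in enumerate(iter(lambda: src.read(CHUNK), b"")):
                    name = os.path.join(out_dir, f"backup.part{i:04d}")
                    with open(name, "wb") as dst:
                        dst.write(piece)
                    # Record a checksum so each piece can be verified
                    # on the receiving side after the transfer.
                    print(name, hashlib.md5(piece).hexdigest())

        split_file(r"D:\backups\db.bak", r"D:\backups\pieces")

    On the receiving end, the pieces can be concatenated back together (for example with copy /b on Windows) and the checksums compared before restoring.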

    Read the article

  • browser without gpu support

    - by manuzhang
    Google has an Easter egg that draws a 3D graph, but when I tried it out in Chrome it complained about there being no WebGL support. I've also tested it on Firefox, whose WebGL support was enabled, but ended up with the same problem. Thus, I suspect it's an issue with my GPU. Some googling led me to chrome://gpu, and here's what I got:
    Graphics Feature Status
    - Canvas: Software only, hardware acceleration unavailable
    - HTML Rendering: Software only, hardware acceleration unavailable
    - 3D CSS: Unavailable. Hardware acceleration unavailable
    - WebGL: Unavailable. Hardware acceleration unavailable
    - WebGL multisampling: Unavailable. Hardware acceleration unavailable
    Problems Detected
    - GPU process was unable to boot. Access to GPU disallowed.
    - GL driver is software rendered. Accelerated compositing is disabled.: 59302
    - Mesa drivers in Linux older than 7.11 are assumed to be buggy.
    - Accelerated 2D canvas is unstable in Linux at the moment.
    Version Information
    - Data exported: Tue Apr 10 2012 18:35:57 GMT+0800 (CST)
    - Chrome version: 18.0.1025.151 (Official Build 130497)
    - Operating system: Linux 3.0.0-0300-generic
    - Software rendering list version: 1.27
    - ANGLE revision: 988
    - 2D graphics backend: Skia
    I wonder what each of these problems implies and how I might properly deal with them. I'm using Ubuntu 11.04.

    Read the article

  • Moving large amounts of data between shared hosts

    - by Bryan M.
    I recently acquired a client who is a photographer and was interested in moving web hosts, since his current host had threatened to throw him off due to CPU spiking. The migration went fairly easily, with about 350 MB of website and media files. Then I discovered about 60 GB of client galleries he had failed to mention. I am unable to move this much data myself, since I'm capping out at about 20 kb/s on the FTP connection. Has anyone encountered a situation where they needed to migrate this much data between cheap hosts? Should we contact the hosting companies about this (he is moving from Westhost to MediaTemple)?

    Read the article

  • SQL Source Control with Vault Support Officially Released

    - by Ajarn Mark Caldwell
    HOORAY!  It is officially here!  Today, Red-Gate officially released SQL Source Control version 2.1 with support for Vault. While we have been happily and successfully running the beta version (a.k.a. the Early Access release) of Red-Gate SQL Source Control with support for Vault for quite a while, it is good to have the official RTM (or GOLD, or PROD, or whatever you call your “no-longer-in-beta”) release of the product. As a courtesy to those who have not already read the series, allow me to provide you with these links to my previous posts about this fantastic tool.
    - Using SQL Source Control with Fortress or Vault – Part 1 – Introduction and initial thoughts about the tool and source-controlling SQL code in general.
    - Using SQL Source Control with Fortress or Vault – Part 2 – Additional details about included features and a few warnings.
    - Source Control and SQL Development – Part 3 – How we did it in the good ol’ days before this product came along.
    - Using SQL Source Control and Vault Professional Part 4 – A few closing thoughts.

    Read the article

  • GUI question: representing a large tree

    - by Peter
    I have a tree-like data structure some six levels deep that I would like to represent on a single web page (tabs, trees, ...). At each level, both child nodes and content are possible. Presenting it as a literal tree would not be very usable (too big). I was thinking along the lines of hiding parts of the tree as you drill down and presenting breadcrumbs or the like to keep you informed of where you are... I guess my question boils down to: any ideas / examples? Tx!

    Read the article

  • Extract large zip file (50G) on Mac OS X

    - by chingjun
    I was trying to move my files to another hard drive, so I archived all my photos in one large zip file using the Mac OS X built-in compress function. But the file failed to extract. I've tried many programs, but none of them was able to extract the file: Mac OS X's extract utility, Stuffit Expander, and 7zip (command line) all failed. Mac's Archive Utility and Stuffit don't seem to support large files, and 7zip's command-line version gave an error stating the archive is unsupported. I had no luck on Windows either, as many of my files have Chinese filenames and wouldn't extract to the correct names under Windows. Could anyone please suggest a program that supports large files, can handle files compressed with Mac OS X's compress function, and supports UTF-8 filenames? With or without a GUI is fine. Thank you in advance.
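
    As one option to try: Python's zipfile module reads ZIP64 archives (the format extension required once a zip grows past 4 GB, which Mac OS X's compress function typically produces for archives this size) and extracts names as UTF-8 when the archive marks them as such. A minimal sketch, assuming the archive path and Python 3:

        import zipfile

        # zipfile handles ZIP64 transparently when reading. On Python 3.11+
        # a metadata_encoding argument is also available for archives whose
        # names were stored in a non-UTF-8 encoding.
        with zipfile.ZipFile("photos.zip") as zf:
            bad = zf.testzip()  # returns the first corrupt member, if any
            if bad is not None:
                print("corrupt member:", bad)
            zf.extractall("photos/")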

    Read the article

  • design for a parser to handle very large files

    - by user619818
    I have written a program which records protocol messages between an application and a hardware device, matching each application request with each hardware response. This is so that I can later remove the hardware, connect a 'replay' application to the main application, wait for an application request, and reply with a matched copy of the requisite hardware response message. My replay application saves the matched request/response pairs in a list (using C++ std::list). This works fine for a small interaction session. My problem now is that I need to be able to replay a very long session. With my current implementation, the replay program eventually uses up all available memory on my computer and crashes. So I need some sort of look-ahead, rather than parsing the whole session in one go. Can anyone make any suggestions on how to get started?
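
    One common design for this is to scan the capture once, keep only the byte offset of each response in an index, and seek back to load a response on demand, so memory holds offsets rather than message bodies. A rough sketch in Python (the one-message-per-line layout and the REQ/RSP prefixes are made-up stand-ins for the actual protocol format):

        class ReplayIndex:
            """Index response offsets once; load each response lazily."""

            def __init__(self, path):
                self.path = path
                self.offsets = {}  # request payload -> offset of its response
                with open(path, "rb") as f:
                    pending = None
                    offset = f.tell()
                    line = f.readline()
                    while line:
                        if line.startswith(b"REQ "):
                            pending = line[4:].strip()
                        elif line.startswith(b"RSP ") and pending is not None:
                            self.offsets[pending] = offset
                            pending = None
                        offset = f.tell()
                        line = f.readline()

            def response_for(self, request):
                # Only one response body is ever held in memory at a time.
                off = self.offsets.get(request)
                if off is None:
                    return None
                with open(self.path, "rb") as f:
                    f.seek(off)
                    return f.readline()[4:].strip()

    If even the request keys are too large to keep, a fixed-size hash of each request can serve as the dictionary key instead.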

    Read the article

  • Inheritance vs containment while extending a large legacy project

    - by Flot2011
    I have got a legacy Java project with a lot of code. The code uses the MVC pattern and is well structured and well written. It also has a lot of unit tests and is still actively maintained (bug fixing, minor feature additions). Therefore I want to preserve the original structure and code style as much as possible. The new feature I am going to add is a conceptual one, so I have to make my changes all over the code. In order to minimize the changes, I decided not to extend existing classes but to use containment:

        class ExistingClass {
            // .... existing code

            // my code adding new functionality
            private ExistingClassExtension extension =
                new ExistingClassExtension();

            public ExistingClassExtension getExtension() { return extension; }
        }

        ...
        // somewhere in code
        ExistingClass instance = new ExistingClass();
        ...
        // when I need the new functionality
        instance.getExtension().newMethod1();

    All the functionality I am adding lives inside the new ExistingClassExtension class, so in effect I am adding only those two lines to each class that needs to be extended. By doing so, I also do not need to instantiate new, extended classes all over the code, and I can use the existing tests to make sure there is no regression. However, my colleagues argue that doing so isn't a proper OOP approach in this situation, and that I need to inherit from ExistingClass in order to add new functionality. What do you think? I am aware of the numerous inheritance/containment questions here, but I think my question is different.

    Read the article

  • Vector.Unproject - Checking if a model intersects a large sprite

    - by Fibericon
    Let's say I have a sprite, drawn like this:

        spriteBatch.Draw(levelCannons[i].texture, levelCannons[i].position,
            null, alpha, levelCannons[i].rotation, Vector2.Zero, scale,
            SpriteEffects.None, 0);

    Picture levelCannon as a laser beam that goes across the entire screen. I need to see if my 3D model intersects with the screen space inhabited by the sprite. I managed to dig up Vector.Unproject, but that seems to be useful only when dealing with a single point in 2D space, rather than an area. What can I do in my case?
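
    One usual approach goes the other way around: Project (rather than Unproject) the model's bounding-sphere center into screen space, then test the resulting circle against the sprite's screen rectangle. A rough illustration of the math in Python/NumPy, using column-vector conventions and a stand-in view-projection matrix where the game's real matrices would go:

        import numpy as np

        def project_to_screen(point, view_proj, viewport_w, viewport_h):
            """Project a 3D world-space point to 2D pixel coordinates."""
            clip = view_proj @ np.append(point, 1.0)  # homogeneous clip space
            ndc = clip[:3] / clip[3]                  # device coords in -1..1
            x = (ndc[0] * 0.5 + 0.5) * viewport_w
            y = (1.0 - (ndc[1] * 0.5 + 0.5)) * viewport_h  # screen y is down
            return np.array([x, y])

        def circle_hits_rect(center, radius, rect_min, rect_max):
            # Circle-vs-rectangle: clamp the circle's center into the
            # rectangle and compare the remaining distance to the radius.
            closest = np.clip(center, rect_min, rect_max)
            return np.linalg.norm(center - closest) <= radius

        view_proj = np.eye(4)  # stand-in for the game's view * projection
        center = project_to_screen(np.zeros(3), view_proj, 1280, 720)
        laser = (np.array([0, 350]), np.array([1280, 370]))  # screen rect
        print(circle_hits_rect(center, 40.0, *laser))

    In XNA the same projection is available directly as Viewport.Project, and the projected sphere radius can be approximated by projecting a second point one radius away from the center.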

    Read the article

  • Which pattern is best for large project

    - by shamim
    I have several years of software development experience, but I am not an especially adroit programmer, so to perform better I need a helping hand. I recently got engaged in an ERP project, and for this project I want a very effective structure that will be easily maintainable, with no compromise on performance. The structure below is what my old projects use:
    - Entity Layer
    - BusinessLogic Layer
    - DataLogic Layer
    - UI Layer
    (A picture in the original post shows how they are internally connected.) For my new project I want to change the structure and follow these layers:
    - Core Layer (common)
    - BLL
    - DAL
    - Model
    - UI
    (Again, a picture in the original post shows how they are internally connected.) Though I have googled, some initial questions remain obscure to me:
    - I want to use Entity Framework for the new project; is that a good idea?
    - Will it increase my project's performance?
    - Will it be more maintainable than the previous structure?
    - What are Entity Framework's core advantages/disadvantages?
    For my project I need help selecting the best structure. Will my new structure be better than the old one?

    Read the article

  • Getting started on Large Projects

    - by Mercfh
    So I just graduated from my college with a B.S. in Computer Science (although it was a good school, we're the only accredited CS department in our state... for whatever that means, lol). I feel like I'm a decent programmer, not amazing... but not terrible. Anyways, I got my first job about 2 weeks ago; it's a pretty entry-level job: firmware development/testing (I know, I know, people look down on testers... but I gotta start somewhere). Anyways, there isn't a whole lot of coding to be had right now (mostly simple stuff), but here soon I have the option of helping out with development (which is what I want to do). Thing is... I have NEVER worked on a huge project. I mean, in school sure we had "group" projects, but nothing really big. So I'm not super familiar with HUGE classes and such (my main language was C++)... Is this something I'll just get used to with time? Some fellow students were used to that from internships and such... but I never got that chance. My job was mostly a "one man job" kind of thing, mostly little stuff. Plus in class we never did huge projects anyway. So how do you guys "plan" out these things? Do you use a whiteboard and plan out classes and such... or what? Also... another worry of mine is that I have to use Google... A LOT... for code examples, because sometimes I just don't get how something works. Is this normal? It makes me feel sorta... stupid, I guess. I mean, "technically" I've had 4-5 years of coding experience... but it really only feels like 2 years of REAL experience, if that makes any sense? Thanks

    Read the article

  • Business knowledge in a large financial org?

    - by Victor
    As a programmer working in the finance industry, I recently got a project that is a hedge fund administration application (used to calculate NAVs, allocate assets, etc.). From a business point of view this is a good thing. When we think of our 'next' project, the impulse is typically to think in terms of technology, e.g. 'I want to work on a project that uses SOA/cloud etc.' I am interested to know whether anyone, while planning their career, also takes into account the business aspect of a future project, i.e. what the application does. So does anybody ever think 'I wish to work on a trading system so I can understand capital markets better' instead of 'I want to work on a project that uses SOA/cloud etc.'? I say this because it appears to me that in the finance domain, for senior positions, good business knowledge pays well. So maybe a person who knows more business, but perhaps not the latest technologies, is at an advantage? The rockstar programmer seems better suited to an aggressive startup. In particular, big old finance orgs rarely invest in tech just for the 'cool factor'. No?

    Read the article

  • Why are Back In Time snapshots so large?

    - by Chethan S.
    I just backed up the contents of my home partition onto my external hard drive using Back In Time. I browsed to the backed-up contents on the external drive, and under properties it showed me the size as 9.6 GB. Since I read that for subsequent snapshots Back In Time does not back everything up again, but creates hard links to older content and saves only newer content, I wanted to test this. So I copied two small files into my home partition and ran 'Take Snapshot' again. The operation completed within a minute: it checked the previous snapshot, assessed the changes, detected the two new files, and synced them. After this, when I browsed to the backed-up contents, I was surprised to see the newer and older backups taking up 9.6 GB each. Isn't this a waste of hard drive space? Or did I interpret something wrongly?
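
    File managers typically report the apparent size of each snapshot, counting every hard link as if it were an independent copy, so two snapshots that share nearly all their data can each show 9.6 GB while occupying little more than that in total. One way to verify that two snapshot files really share storage is to compare their inode numbers; a small Python sketch (the snapshot paths are placeholders):

        import os

        a = os.stat("/media/ext/snap1/home/user/photo.jpg")
        b = os.stat("/media/ext/snap2/home/user/photo.jpg")

        # Hard links to the same data share an inode on the same device;
        # st_nlink counts how many directory entries point at that data.
        same_storage = (a.st_ino, a.st_dev) == (b.st_ino, b.st_dev)
        print("shared on disk:", same_storage, "links:", a.st_nlink)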

    Read the article

  • Static pages for large photo album

    - by Phil P
    I'm looking for advice on software for managing a largish photo album for a website: 2000+ pictures, a one-time drop (probably). I normally use MarginalHack's album, which does what I want: it pre-generates thumbnails and HTML for the pictures, so I can serve them without needing a dynamic runtime, and there's less attack surface to worry about. However, it doesn't handle pagination or the like, so it's unwieldy for this case. This is a one-time drop of pictures from a wedding, with a shared usercode/password for distribution to the guests; I don't wish to put the pictures in a third-party hosting environment. I don't wish to use PHP, simply because that's another runtime to worry about; I might relent and use something dynamic if it's Python- or Perl-based (as I can maintain things written in those). I currently have Apache serving static, Album-generated files, with some subdirectories to divide up the content and make it a little more manageable. Something like Album but with pagination already handled would be great, but I'm willing to use something a little more dynamic if it lets people comment or caption and stores the extra data in something like an SQLite DB. I'd want something lightweight, not a full-blown CMS with security updates every three months. I don't want to upload pictures of other people's children to a third-party free service where I don't know what the revenue model is. (For my site: revenue is none, costs are out of pocket.) The existing server hosting is *nix, Apache, some WSGI. Client-side I have MacOS. Any advice?
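
    For the pagination gap specifically, a static generator only needs to chunk the image list and emit one HTML page per chunk, which keeps the no-runtime property. A bare-bones Python sketch (directory names and page size are placeholders, and thumbnails are assumed to already exist under a thumbs/ subdirectory):

        import os

        IMAGES_DIR, OUT_DIR, PER_PAGE = "photos", "site", 50

        images = sorted(f for f in os.listdir(IMAGES_DIR)
                        if f.lower().endswith((".jpg", ".png")))
        pages = [images[i:i + PER_PAGE]
                 for i in range(0, len(images), PER_PAGE)]
        os.makedirs(OUT_DIR, exist_ok=True)

        for n, page in enumerate(pages, start=1):
            # Simple numeric navigation links between the generated pages.
            nav = " ".join(f'<a href="page{i}.html">{i}</a>'
                           for i in range(1, len(pages) + 1))
            body = "\n".join(f'<a href="../{IMAGES_DIR}/{img}">'
                             f'<img src="../{IMAGES_DIR}/thumbs/{img}"></a>'
                             for img in page)
            html = f"<html><body>{nav}<hr>{body}<hr>{nav}</body></html>"
            with open(os.path.join(OUT_DIR, f"page{n}.html"), "w") as f:
                f.write(html)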

    Read the article
