Search Results

Search found 3618 results on 145 pages for 'huge'.

Page 36/145 | < Previous Page | 32 33 34 35 36 37 38 39 40 41 42 43  | Next Page >

  • What Design Pattern is separating transform converters

    - by RevMoon
    For converting a Java object model into XML I am using the following design: for different types of objects (e.g. primitive types, collections, null, etc.) I define each its own converter, which acts appropriately with respect to the given type. This way it can easily be extended without adding code to a huge if-else-then construct. The converters are chosen by a method which tests whether the object is convertible at all, using a priority ordering. The priority ordering is important so that, say, a List is not converted by the POJO converter; even though it is convertible as such, it would be more appropriate to use the collection converter. What design pattern is that? I can only think of a similarity to the command pattern.
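
    Not from the original post, but to make the selection mechanism concrete, here is a minimal sketch (all class and method names are illustrative): each converter reports whether it can handle a value and with what priority, and a small registry picks the highest-priority applicable converter instead of a long if-else chain.

        import java.util.*;

        interface Converter {
            boolean canConvert(Object o);
            int priority();                      // higher wins when several converters apply
            String toXml(Object o, Registry registry);
        }

        class Registry {
            private final List<Converter> converters = new ArrayList<>();

            void register(Converter c) { converters.add(c); }

            String convert(Object o) {
                // pick the applicable converter with the highest priority
                return converters.stream()
                        .filter(c -> c.canConvert(o))
                        .max(Comparator.comparingInt(Converter::priority))
                        .orElseThrow(() -> new IllegalArgumentException("not convertible: " + o))
                        .toXml(o, this);
            }
        }

        class CollectionConverter implements Converter {
            public boolean canConvert(Object o) { return o instanceof Collection; }
            public int priority() { return 10; }          // beats the generic POJO converter
            public String toXml(Object o, Registry r) {
                StringBuilder sb = new StringBuilder("<list>");
                for (Object item : (Collection<?>) o) sb.append(r.convert(item));
                return sb.append("</list>").toString();
            }
        }

        class PojoConverter implements Converter {
            public boolean canConvert(Object o) { return o != null; }
            public int priority() { return 0; }           // catch-all fallback
            public String toXml(Object o, Registry r) { return "<value>" + o + "</value>"; }
        }

        public class Demo {
            public static void main(String[] args) {
                Registry registry = new Registry();
                registry.register(new PojoConverter());
                registry.register(new CollectionConverter());
                // the List goes to CollectionConverter, its elements fall back to PojoConverter
                System.out.println(registry.convert(Arrays.asList(1, 2, 3)));
            }
        }

    Whether you file this under Strategy, Chain of Responsibility, or a bit of both, the registry-plus-priority selection is the part that keeps it extensible.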

    Read the article

  • OpenWorld Presentations and Anatomy of an RTF Template w/ files

    - by mdonohue
    For those who missed it ... or those who made it and couldn't get enough, check out the presentations delivered at OpenWorld: Overview and Roadmap, The Reporting Platform for Oracle Applications, and Best Practices. And even though it wasn't presented at OpenWorld, here is an updated version of Anatomy of an RTF Template that includes documented example files (RTF template, sub-template and sample XML data) so you can re-use and play with the code directly.  Huge thanks to Tim and Hok-Min, who did all the hard, original work on this example loaded with tips and tricks.

    Read the article

  • In what cases should I install (and configure) Postfix as a desktop user?

    - by Gonzalo
    Possible cases: 1) I plan to do Debian packaging (this case is the motivation, since Postfix gets installed as a dependency of some development packages, so in such a case it might be necessary). 2) I plan to use Evolution and an Internet provider mail account. 3) I plan to use Gmail. Surely if I read the Postfix documentation I may find the answer, but it's huge and I couldn't find it. In any case, how (or where) should I find the answer to a question like that by myself? (I really tried.)

    Read the article

  • Site Directory in SharePoint 2010

    - by Enrique Lima
    Sometimes there are so many changes to the environment in which we work and live that something will escape our attention. In discussing functionality with a client today, the topic of Site Directory came up, and the comment of “well, we use it but it is too bad it is not available in 2010” was made … and a huge sense of huh?!? came to me. Well, the item that escaped me came to light (and I know there are others).  So researching it was not only an option, it was/is my duty. I found an excellent post by Bill Baer on the subject.  And not only that, but there is a reference to a potential solution.  This is something that is being worked on through CodePlex; here is the link: http://spsitedirectory2010.codeplex.com/

    Read the article

  • Counting product releases if you work on the backend/online services?

    - by stackoverflowuser2010
    I am trying to update my resume, and I would like to count the number of "product releases" that I was directly involved in with a company. It would seem to serve as a performance metric. The problem is that I was working on the backend of a very large distributed system, along the lines of Hadoop or another huge database system. We had regular 6-month major releases and other minor releases. My manager kept saying that we "shipped" these releases, but "shipping" a product to me sounds like releasing single pieces of software, like Microsoft would ship Office 11 or something. Any ideas on "product releases" for backend service engineers, or any other type of performance metric?

    Read the article

  • Cocos2D Command-Line Application

    - by Hasyimi Bahrudin
    Is it possible to create a terminal application which uses cocos2d? I've tried to make one using cocos2d 2.x, but it requires a MacGLView to be initialized. I need it so that I could program a terminal application that generates a screenshot given a TMX file and an optional preferred width or height parameter (for resizing). Then I can automate the generation of map previews for my game, instead of manually taking screenshots. It's not practical to load the actual TMX and resize it inside the game (what I'm currently doing), because each TMX file has 7 layers, my tile sheet is huge, and I have lots of levels.

    Read the article

  • What's the best web entrepreneur's conference to go to

    - by user4845
    Our dream as business partners was always to go to a conference overseas. Now that we can finally afford to, we have no clue which one to go to. Which conference would you guys suggest from a web entrepreneur's point of view, if there were just one to go to? Perhaps something that includes a bit of marketing, new thinking, innovation, inspiration. We were very keen on Google IO previously, but were concerned that it would be very focused on Google products. Huge thanks!

    Read the article

  • PHP efficiency question [closed]

    - by Ron
    Hello everyone. I am working on a website and I am trying to make it as fast as possible - especially the small things that can make my site a little bit quicker. So, to my question - I have a loop that runs 5 times and each time it echoes something. If I make a variable, have the loop append the text I want to echo to that variable, and only echo the variable at the end - will it be faster?
    Loop 1 (with the echo inside the loop):
        for ($i = 0; $i < 5; $i++) {
            echo "test";
        }
    Loop 2 (with the echo outside, when the loop finishes):
        $echostr = "";
        for ($i = 0; $i < 5; $i++) {
            $echostr .= "test";
        }
        echo $echostr;
    I know that loop 2 will increase the file size a bit and therefore the user will have to download more bytes, but if I have a huge loop, will it be better to use the second loop or not? Thanks.

    Read the article

  • How does sprite customization in 2D games work?

    - by Alouette
    I'm working on the design of a new game concept at the moment and I would like to know how to handle customization of sprites. (In 2D that is, hence the topic.) This is my scenario: the player will have a tower containing 3 floors (or more). Each floor can be replaced by another "piece", e.g. a blue floor, a fire floor, a stone floor. With the little knowledge of game development I have, creating a sprite for each possible combination is probably not a good idea, since the size of the game file will be HUGE. So, how do developers solve this? Do you define a standard position for each floor and just swap the sprite itself? Any advice or information about this would be great. Regards.
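
    One common way to avoid baking every combination into its own sprite is to keep exactly one sprite per floor type and draw the chosen floors stacked at fixed slot positions at render time. A minimal sketch of that idea, using plain Java2D and made-up names purely for illustration:

        import java.awt.Graphics2D;
        import java.awt.image.BufferedImage;
        import java.util.HashMap;
        import java.util.Map;

        enum FloorType { BLUE, FIRE, STONE }

        class Tower {
            private static final int FLOOR_HEIGHT = 64;            // assumed slot height in pixels
            private final FloorType[] floors;                      // index 0 = ground floor
            private final Map<FloorType, BufferedImage> sprites;   // exactly one sprite per floor type

            Tower(FloorType[] floors, Map<FloorType, BufferedImage> sprites) {
                this.floors = floors;
                this.sprites = sprites;
            }

            // Draw each chosen floor's sprite at its fixed slot; combinations never need their own asset.
            void draw(Graphics2D g, int x, int baseY) {
                for (int i = 0; i < floors.length; i++) {
                    g.drawImage(sprites.get(floors[i]), x, baseY - (i + 1) * FLOOR_HEIGHT, null);
                }
            }
        }

        public class SpriteDemo {
            public static void main(String[] args) {
                Map<FloorType, BufferedImage> sprites = new HashMap<>();
                for (FloorType t : FloorType.values())
                    sprites.put(t, new BufferedImage(64, 64, BufferedImage.TYPE_INT_ARGB)); // stand-ins for real art
                Tower tower = new Tower(
                        new FloorType[] { FloorType.STONE, FloorType.BLUE, FloorType.FIRE }, sprites);
                BufferedImage frame = new BufferedImage(64, 256, BufferedImage.TYPE_INT_ARGB);
                tower.draw(frame.createGraphics(), 0, 256);         // composite the three chosen floors
            }
        }

    Swapping a floor is then just replacing one entry in floors[]; the game ships one sprite per floor type, not one per combination.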

    Read the article

  • Master Data Management - The Trend Towards Multi-Domain and Other Realities

    - by Mala Narasimharajan
    In my quest to keep my fingers on the pulse of MDM, I recently found a pretty interesting article.  The article was published in Information Week and provides some interesting statistics from a recent survey conducted by the analyst firm, The Information Difference.  Let's take a look:
      - Of the 130 organizations surveyed, 53% have live operational MDM implementations.
      - 81% of those with live operational MDM implementations report broad success - a huge improvement over 2011's 54%.
      - 64% developed a business case prior to their MDM deployment, while a daring 32% went ahead without a business case.
    The article goes on to talk about the shift in vendors from focusing on customer data and product information management to one that is oriented around multi-domain master data management, as well as other realities around MDM.  Take a look at the article. For more information on Oracle's master data management suite, click here.

    Read the article

  • T-SQL Tuesday #028: Whaddya Mean, “Not Your Job?”

    - by merrillaldrich
    This T-SQL Tuesday, hosted by Argenis Fernandez ( Blog | Twitter ) is devoted to the question, “Are you a Jack-of-all-Trades? Or a specialist?” This question really hits home for me, on a number of levels. (Aside: I have huge respect for Argenis – he’s smart, funny, no-nonsense, very accomplished. If you don’t follow him, do.) If you have read any of my previous ramblings on this blog, you may know I was originally educated as an architect – the bricks and mortar kind, not the information systems...(read more)

    Read the article

  • How do I recover files from U1 cloud

    - by MarkWW
    OK, I have just stopped syncing a folder. This appears to have been a huge mistake. I did not realise the folder would disappear from my cloud storage. I assume (and hope) the sub-folders/files it contained still exist in the cloud. I hope they still exist in the cloud, because they do not exist on my computer. I did not delete these files from the cloud (I have tried to recover deleted files - none are being recovered). Where are they, and how do I get them back into My storage?

    Read the article

  • How to organize my 1000s of PDFs?

    - by mmb
    I have a huge collection of PDFs. Mostly it consists of research papers and self-created documents, but also of scanned documents. Right now I drop them all in one folder and give them precise names with tags in the filename. But even that gets impractical, so I am looking for a PDF library management application. I am thinking of something like Yep for Mac, with the following features:
      - PDF cover browsing (with large preview, larger than Nautilus allows)
      - tagging of PDFs (data should be readable cross-platform)
      - possibility to share across the network (thus flat files rather than a database)
      - if possible: cross-platform
    Mendeley seemed to be a good choice, but I don't have only academic papers and don't want to fill in all the metadata that is required there. The only alternative I could find thus far is Shoka, but the features are limited and development seems to have stopped already.

    Read the article

  • How are dependent quests generated in Guild Wars 2?

    - by Aufziehvogel
    I recently read that Guild Wars 2 uses a system where the creation of quests depends on which actions users took when they were presented with another quest. An example was: there might be a quest to protect a person. If users do not take this action, the person might be kidnapped, and later there is a quest to rescue this person. Is there any information on whether the creation of these quests is somehow automatic? From the article it sounded automatic, but from the specific example you could also guess that people just created a task-set where they added conditions (Task 1 taken: OK; Task 1 not taken: show Task 2). From what I have heard about AI, they might also have implemented some sort of huge neural network to make decisions?
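
    On the "task-set with conditions" reading, no neural network is required: a plain list of quest definitions, each with a trigger condition over what players did with earlier quests, already produces the kidnap-then-rescue behaviour. A minimal sketch of that idea (entirely hypothetical names, not ArenaNet's actual system):

        import java.util.*;
        import java.util.function.Predicate;

        // Outcome of an offered quest: taken and completed, or ignored/failed by the players.
        enum Outcome { COMPLETED, IGNORED }

        class QuestDef {
            final String name;
            final Predicate<Map<String, Outcome>> trigger;   // when should this quest appear?
            QuestDef(String name, Predicate<Map<String, Outcome>> trigger) {
                this.name = name;
                this.trigger = trigger;
            }
        }

        public class QuestChainDemo {
            public static void main(String[] args) {
                List<QuestDef> defs = List.of(
                    new QuestDef("Protect the merchant",
                            history -> !history.containsKey("Protect the merchant")),
                    // follow-up only exists if the protection quest was ignored (merchant kidnapped)
                    new QuestDef("Rescue the merchant",
                            history -> history.get("Protect the merchant") == Outcome.IGNORED)
                );

                Map<String, Outcome> history = new HashMap<>();
                history.put("Protect the merchant", Outcome.IGNORED);   // the players let it slip

                for (QuestDef q : defs)
                    if (q.trigger.test(history))
                        System.out.println("Now available: " + q.name);  // -> Rescue the merchant
            }
        }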

    Read the article

  • Getting weather information (from a thermometer and a hygrometer)

    - by EKS
    I have decided, as an arrogant geek, to build my own home ventilation and heating system, and will try to do this as my little project. I have always been annoyed with the lack of good ventilation systems at work, so I accept that building my own is arrogant. Does anyone know of a device that lets me read temperature and humidity and that I can interact with using C#? I cannot get it from the Internet because I need to get the humidity from my server "room", so I can control the dehumidifier there. The same goes for temperature; outside is not that important. It would be a huge plus if the sensors had some sort of wireless access.

    Read the article

  • HTTP resource bundling/streaming practice

    - by icelava
    Our SPA (plain HTML and JavaScript) makes use of a huge volume of JavaScript and other resources that are downloaded via XHR. Given the sheer number of components and browsers' simultaneous request limits, we're thinking of ways to deliver our resources in a more efficient manner. A method we're considering is bundling several resources that logically form a coherent group into a single file, thus reducing to only one XHR (per group). Furthermore, to make it more responsive, we'd like to constantly inspect the partial responseText during the LOADING state, determine whether a usable chunk (atomic resource) has already been downloaded, and make it available for deserialization/processing even before the XHR is DONE (a stream-like experience). Surely somebody else has considered roughly the same approach before, but we haven't really come across any library/framework or container file format that is suitable for our scenario. Does anybody else know of something similar?
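
    One simple container format for such a bundle is length-prefixed framing: each resource is written as its length, a newline, then its payload, so the loader can pull every complete chunk out of the partial responseText on each progress event and leave the incomplete tail for later. The framing logic is language-agnostic; here is a minimal sketch of the reader side in Java (format and names are assumptions, not an existing library):

        import java.util.ArrayList;
        import java.util.List;

        // Minimal length-prefixed container: each chunk is "<length>\n<payload>", repeated.
        public class BundleReader {
            private int pos = 0;   // start of the first chunk not yet handed out

            // Call with the full partial responseText each time new data arrives (it only ever grows).
            List<String> feed(String partialResponse) {
                List<String> complete = new ArrayList<>();
                while (true) {
                    int nl = partialResponse.indexOf('\n', pos);
                    if (nl < 0) break;                                    // length header not fully received yet
                    int length = Integer.parseInt(partialResponse.substring(pos, nl));
                    int start = nl + 1;
                    if (partialResponse.length() < start + length) break; // payload still downloading
                    complete.add(partialResponse.substring(start, start + length));
                    pos = start + length;                                 // consume this chunk, keep scanning
                }
                return complete;                                          // chunks usable before the XHR is DONE
            }

            public static void main(String[] args) {
                BundleReader reader = new BundleReader();
                // Simulate two snapshots of a growing responseText.
                System.out.println(reader.feed("5\nhello7\nworl"));              // [hello]  (second chunk incomplete)
                System.out.println(reader.feed("5\nhello7\nworld!!3\nabc"));     // [world!!, abc]
            }
        }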

    Read the article

  • Is client-side HTML5/JavaScript too lame after you've worked on server-side C++/Java?

    - by stackoverflowuser2010
    I'm an experienced C++/C/Java/C# research software engineer and have worked on large-scale server systems, including huge map-reduce and database systems. Now I've been offered a new job working with client-side mobile technologies involving JavaScript and HTML5, as well as some very minor native iPhone and Android programming. So, question: if you've ever made this kind of jump, did you find JavaScript/HTML too lame after you had been working on "hard-core" C++ and server systems? Did you find it challenging? Did you get bored?

    Read the article

  • Languages and tools that are "portable" (work well from a USB storage drive) [closed]

    - by CodexArcanum
    I'm a huge fan of running programs from my portable hard drive: it means I always have my favorite tools no matter what computer I'm on. Sadly, development tools seem to be hard to make portable at times. I recently realized that the "portable" version of MinGW I was using off my USB drive was actually interfering with a locally installed version of MinGW, so sometimes even the tools you think are portable aren't. So what are the best portable development tools that you've used? What runs well on portable media, leaves the host machine clean, and generally makes moving around easier for you?

    Read the article

  • Linux kernel: stable version 2.6.38 is available; it optimizes path resolution in the VFS layer

    Linux kernel: stable version 2.6.38 is available. It optimizes path resolution in the VFS layer. The stable version 2.6.38 of the Linux kernel has just been made available and officially announced by Linus Torvalds. This version brings significant performance improvements. The kernel now includes transparent huge page (THP) support, which provides better performance on workloads that need a lot of memory (think of JVM servers and database servers). THP uses large memory pages (2 MB) as opposed to the traditional 4 KB pages. Another new feature is the...

    Read the article

  • Using Git in an enterprise environment

    - by sarat
    Git is an excellent version control system. If we exclude the fact that it doesn't have excellent GUI support, it's really good and fast. But source control systems like ClearCase have extensive support for enterprise customers. Companies invest huge amounts in source control servers and licenses. Of late, most large companies like Google have been adopting Git over other version control systems. But such a company has a strong open source group which consistently provides development and support for the tool (they might even have a custom version of Git of their own). At the same time, many large companies are not really keen on adopting open source projects and making them work for them. Is Git really a reliable tool for an enterprise environment, especially on the Windows platform? Support is a question mark for Git, as it's an open source version control system. Are there any companies that provide solutions and support? How do the server costs compare to other version control systems like ClearCase?

    Read the article

  • General usage question about VBOs

    - by CSharpie
    First of all, I am sorry if my question is too broad. I am developing a tile-based game and switched from those gl.Begin calls to using VBOs. This is kind of working already; I managed to render a hexagonal polygon with a simple shader applied. What I am not sure about is how to implement the "whole" tile concept. Concretely, the questions are:
      - Is it better to create one VBO for a single tile and render it n times in every different position, or to render one huge VBO that represents the whole "world"? (See the sketch below.)
      - Depending on the answer above, what is the best way to draw a line grid? Overlay with the same VBO using the respective polygon mode, or is there a way to let the shader do this?
      - How would frustum culling or mouse picking work then? Do I need to keep the VBO data in memory?
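
    On the first question, a common middle ground is to keep a single hex template and batch many tiles into one big vertex array on the CPU, uploaded once, so the whole map (or each visible region) becomes very few draw calls instead of one per tile. A minimal sketch of the CPU-side batching only (tile layout, sizes and names are assumptions; the GL upload is only indicated in a comment):

        public class TileBatcher {
            // One hexagon as a triangle-fan template: centre + 6 corners + repeat of the first corner.
            static float[] hexTemplate(float radius) {
                float[] v = new float[2 * 8];
                v[0] = 0f; v[1] = 0f;                               // centre
                for (int i = 0; i <= 6; i++) {
                    double a = Math.PI / 3.0 * i;
                    v[2 * (i + 1)] = (float) (radius * Math.cos(a));
                    v[2 * (i + 1) + 1] = (float) (radius * Math.sin(a));
                }
                return v;
            }

            // Append every tile, already translated to its grid position, into one big array.
            static float[] buildWorldVertices(int cols, int rows, float radius) {
                float[] template = hexTemplate(radius);
                float[] world = new float[cols * rows * template.length];
                int out = 0;
                for (int r = 0; r < rows; r++) {
                    for (int c = 0; c < cols; c++) {
                        float cx = c * radius * 1.5f;
                        float cy = r * radius * 1.732f + (c % 2) * radius * 0.866f;  // offset columns
                        for (int i = 0; i < template.length; i += 2) {
                            world[out++] = template[i] + cx;
                            world[out++] = template[i + 1] + cy;
                        }
                    }
                }
                // Upload once, e.g. glBufferData(GL_ARRAY_BUFFER, world, GL_STATIC_DRAW), then issue far
                // fewer draw calls than one per tile (e.g. glMultiDrawArrays, or an index buffer of triangles).
                return world;
            }

            public static void main(String[] args) {
                float[] world = buildWorldVertices(100, 100, 1.0f);
                System.out.println("vertices: " + world.length / 2);  // 100*100 tiles * 8 vertices = 80000
            }
        }

    For culling and mouse picking you then keep (or can rebuild) the CPU-side array and tile positions, independently of what lives in the VBO.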

    Read the article

  • Why does my FPS drop gradually over time?

    - by mmankt
    I'm working on this game: yt alpha preview. I ran into a huge game-breaking problem - after 10-15 min of gameplay the FPS drops from 60 to 30 and is very unstable. I'm using tons of physics and particles; I'm deleting and nulling everything I can once it's supposed to be removed, I remove objects from vectors, etc. The memory usage is stable at around 150 MB, so a leak is unlikely (or invisible?) - after a round ends and I delete everything, I play a new round and performance is still terrible. I've spent two days trying to figure this out and I just can't fix it. Maybe I'm missing something? I know it's hard to help with no code, but I would just have to post my whole source.

    Read the article

  • What techniques can I use to render very large numbers of objects more efficiently in OpenGL?

    - by Luke
    You can think of my application as drawing a very large ball-and-stick diagram (or graph). At times, this graph can get very large, to where the number of elements even outnumbers the pixels on the screen. Currently I am simply passing all of my textures (as GL_POINTS) and lines to the graphics card using VBOs. When the number of elements outnumbers the number of pixels, is this the most efficient way to do this? Or should I do some calculations on the CPU side before handing everything over to the GPU? If it matters, I do use GL_DEPTH_TEST and GL_ALPHA_TEST. I do some alpha blending, but probably not enough to make a huge performance difference. My scene can be static at times, but the user has control over a typical arc-ball camera and can pan, rotate, or zoom. It is during these operations that performance degradation is noticeable.
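
    Since the elements can outnumber the pixels, one CPU-side calculation that can pay off is collapsing the point set to at most one point per screen pixel before uploading: project each point, bucket it by pixel, keep the nearest, and send only the survivors to the VBO. A minimal sketch of that reduction (names are made up, and it assumes the points are already in normalized device coordinates):

        import java.util.HashMap;
        import java.util.Map;

        public class PointReducer {
            // points: packed x,y,z triples in normalized device coordinates (-1..1).
            // Returns at most one point per screen pixel, keeping the nearest (smallest z).
            static float[] reduceToPixels(float[] points, int screenW, int screenH) {
                Map<Integer, Integer> nearestPerPixel = new HashMap<>();   // pixel index -> point offset
                for (int i = 0; i < points.length; i += 3) {
                    float x = points[i], y = points[i + 1], z = points[i + 2];
                    if (x < -1 || x > 1 || y < -1 || y > 1 || z < -1 || z > 1) continue; // trivially culled
                    int px = (int) ((x * 0.5f + 0.5f) * (screenW - 1));
                    int py = (int) ((y * 0.5f + 0.5f) * (screenH - 1));
                    int pixel = py * screenW + px;
                    Integer prev = nearestPerPixel.get(pixel);
                    if (prev == null || z < points[prev + 2]) nearestPerPixel.put(pixel, i);
                }
                float[] out = new float[nearestPerPixel.size() * 3];
                int o = 0;
                for (int offset : nearestPerPixel.values()) {
                    out[o++] = points[offset];
                    out[o++] = points[offset + 1];
                    out[o++] = points[offset + 2];
                }
                return out;   // upload this (much smaller) array to the VBO instead of the full set
            }

            public static void main(String[] args) {
                float[] pts = { 0f, 0f, 0.5f,   0.0001f, 0.0001f, 0.2f,   2f, 2f, 0f }; // last one off-screen
                float[] kept = reduceToPixels(pts, 1920, 1080);
                System.out.println(kept.length / 3 + " of 3 points survive");           // prints: 1 of 3
            }
        }

    Whether this wins in practice depends on how the per-frame CPU pass compares with the overdraw it removes, especially while the camera is moving; profiling both paths is the only way to know.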

    Read the article

  • Disable "Ubuntu 14.04" and "Files" from auto sorting alphabetically and THEN<(Disable the then.) automatically going to that location

    - by Doodski Man
    I looked around in Ubuntu 14.04 and I could not find an option to disable a feature that makes extra work for me. After renaming something, "Files" reorganizes alphanumerically and then automatically moves the cursor/highlighting to the new file name at its new alphanumerical location. It's much easier and more practical if the working cursor remains in the same area where the file that needed its name changed or its spelling corrected was found. Windows 7 does what I'm asking about, and it's a huge improvement over that auto-sort-then-jump behaviour. An example to consider: a file is in the "A" area, you rename it to something starting with "Z", and you are then taken to the "Z" section and need to scroll hundreds of lines back to the "A" section. It's annoying. Thanks for reading and taking the time.

    Read the article

  • How to structure my Java packages

    - by MightyPork
    I have a Java library, quite a huge one. I'm asking about best practices for structuring the source. For example, the logging subsystem:
    Option 1: all in one package, named to sort nicely:
      - Log - static accessor
      - LogMonitor - interface for a log monitor
      - LogMonitorBase - abstract class
      - LogMonitorStdout - prints the log to the console
      - LogWriter - interface for a file logger
      - LogWriterSimple - log writer with just one log file
      - LogWriterArchiving - log writer that handles old log files
    Option 2: subpackages for monitors and writers, with better names:
      - Log
      - monitors/LogMonitor
      - monitors/BaseMonitor
      - monitors/StdoutMonitor
      - writers/LogWriter
      - writers/SimpleLog
      - writers/ArchivingLog
    The second maybe looks better, but perhaps it's not so practical from the Java point of view (two extra packages). What do you suggest as the best practice here? A lot in one package, grouped by naming prefixes, or a lot of subpackages with better names?
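
    For a feel of how Option 2 reads in code, a tiny sketch (the package root com.example.logging is made up):

        // File: com/example/logging/writers/LogWriter.java  (Option 2 layout)
        package com.example.logging.writers;       // the subpackage carries the category ...

        public interface LogWriter {               // ... so the class name can stay short
            void write(String line);
        }

        // Call sites then read:
        //   Option 2:  import com.example.logging.writers.ArchivingLog;
        //   Option 1:  import com.example.logging.LogWriterArchiving;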

    Read the article
