Search Results

Search found 80052 results on 3203 pages for 'data load performance'.


  • Ray Tracing concerns: Efficient Data Structure and Photon Mapping

    - by Grieverheart
    I'm trying to build a simple ray tracer for specific target scenes. An example of such a scene can be seen below. I'm concerned about which accelerating data structure would be most efficient in this case, since all objects are touching but, on the other hand, the scene is uniform. The objects in my ray tracer are stored as a collection of triangles, so I also have access to individual triangles. Also, when trying to find the bounding box of the scene, how should infinite planes be handled? Should one instead use the viewing frustum to calculate the bounding box? A few other questions I have are about photon mapping. I've read the original paper by Jensen and much more material. In the compact data structure for the photon they introduce, they store photon power as 4 chars, which from my understanding is 3 chars for color and 1 for flux. But I don't understand how 1 char is enough to store a flux of the order of 1/n, where n is the number of photons (I'm also a bit confused about flux vs. power). The other question about photon mapping is whether it would be more efficient in my case to store photons per object (or even per object's triangle) instead of using a balanced kd-tree. Also, the same question about the bounding box of the scene, but for photon mapping: how should one find a bounding box from the point of view of the light when infinite planes are involved?
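    On the 4-char photon power: as far as I can tell from Jensen's description, those four bytes are Ward's shared-exponent (RGBE) packing of the RGB power, i.e. three mantissa bytes plus one shared exponent byte, rather than 3 chars of colour plus 1 char of flux. That is why a per-photon power of order 1/n stays representable: the shared exponent carries the magnitude. A minimal sketch of that packing in Java (names are illustrative, not from any particular renderer):

        // A minimal sketch of Ward's shared-exponent (RGBE) packing; class, method and
        // variable names are illustrative, not from any specific ray-tracing library.
        final class PhotonPacking {

            /** Packs an RGB power triple into 3 mantissa bytes plus 1 shared exponent byte. */
            static int[] packPowerRGBE(float r, float g, float b) {
                float max = Math.max(r, Math.max(g, b));
                if (max < 1e-32f) {
                    return new int[] {0, 0, 0, 0};              // treat as black
                }
                int e = Math.getExponent(max) + 1;              // max = m * 2^e with m in [0.5, 1)
                float scale = (float) Math.scalb(256.0, -e);    // largest channel maps into [128, 256)
                return new int[] {
                    (int) (r * scale),
                    (int) (g * scale),
                    (int) (b * scale),
                    e + 128                                     // biased shared exponent
                };
            }
        }
        // Decoding multiplies each mantissa byte by 2^(e - 128) / 256, which recovers the
        // (approximate) power, however small it is relative to the total light power.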

    Read the article

  • How to show or direct a business analyst to do data modelling?

    - by AaronLS
    Our business analysts pushed hard to collect data through a spreadsheet. I am the programmer responsible for importing that data. Usually when they push hard for something like this, I never know how well it will work out until a few weeks later, when I have time assigned to work on the task of programming the import of the data. I have tried to do as much as possible along the way: named ranges, data validations, etc. But I usually don't have time to take a detailed look at all the data and compare it to the destination in the database to determine how well it matches up. A lot of times there will be a little table of items that somehow I have to relate to something else in the database, but there are no natural or business keys present that would allow me to do so. I make the best of this, trying to write something that can compare strings and make a best guess, and then go through the effort of creating interfaces for a user to match the imported data to the destination. I feel like if the business analyst were actually creating a data model, they would be forced to think about these relationships, and would have an appreciation for the need for natural or business keys to be part of the spreadsheet so the data can be imported smoothly. The closest they come to business analysis is a big flat list of fields, and that would be fine if it were like any other data dictionary and included data types and relationships, but it isn't. It is just a bunch of names, with no indication of what type of data they might hold, and it is up to me to guess. When I have pushed for more detail, they say that it is just busy work. How can I explain the importance of data modelling? How can I tell them what it is and how to do it? It feels impossible, because they don't have an appreciation for its importance. They do, however, usually have an interest in helping out in whatever way they can; it's just that this in particular has never gotten a motivated response.

    Read the article

  • Graph data structures and journal format for mini-IDE

    - by matec
    Background: I am writing a small/partial IDE. Code is internally converted/parsed into a graph data structure (for fast navigation, syntax checking, etc.). Functionality to undo/redo (also between sessions) and to restore from a crash is implemented by writing to and reading from a journal. The journal records modifications to the graph (not to the source). Question: I am hoping for advice on a decision on data structures and journal format. For the graph I see two possible versions: g-a: Graph edges are implemented such that one node stores references to other nodes via memory address. g-b: Every node has an ID, and there is an ID-to-memory-address map; the graph uses IDs (instead of addresses) to connect nodes. Moving along an edge from one node to another then requires a lookup in the ID-to-address map each time. And also for the journal: j-a: There is a current node (like the current working directory in a shell + file-system setting). The journal contains entries like "create new node and connect to current", "connect first child of current node" (relative IDs). j-b: The journal uses absolute IDs, e.g. "delete edge 7 - 5", "delete node 5". I could e.g. combine g-a with j-a or combine g-b with j-b. In principle g-b and j-a should also be possible. [My first/original attempt was g-a and a version of j-b that uses addresses, but that turned out to cause severe restrictions: nodes cannot change their addresses (or the journal would have to keep track of it), and using the journal between two sessions is a mess (or even impossible).] I wonder if variant a or variant b or a combination would be a good idea, what advantages and disadvantages they would have, and especially whether some variant might cause trouble later.
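    For what it's worth, a minimal sketch of the g-b + j-b combination in Java (all names illustrative): because neither the graph nor the journal refers to memory addresses, nodes can be reallocated freely and the journal can be replayed in a later session.

        import java.util.*;

        // Variant g-b: nodes refer to each other by integer ID through a lookup map.
        class IdNode {
            final int id;
            final List<Integer> edges = new ArrayList<>();       // neighbour IDs, not references
            IdNode(int id) { this.id = id; }
        }

        class IdGraph {
            private final Map<Integer, IdNode> byId = new HashMap<>();
            private final List<String> journal = new ArrayList<>(); // variant j-b: absolute IDs
            private int nextId = 1;

            int createNode() {
                IdNode n = new IdNode(nextId++);
                byId.put(n.id, n);
                journal.add("create node " + n.id);               // replayable in a later session
                return n.id;
            }

            void connect(int from, int to) {
                byId.get(from).edges.add(to);
                journal.add("create edge " + from + " " + to);
            }

            IdNode resolve(int id) { return byId.get(id); }       // the one extra lookup per hop
        }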

    Read the article

  • Turning a problem into a data model

    - by Fogmeister
    OK, I have an app that I'm creating, but I'm just really not sure how to approach the problem. The idea is fairly simple; I'm just not sure how to wrap it in a data model (or even if I should). TBH, I feel like I'm making it more complicated than it needs to be. How it works: the app will have circles along the top in a row that need to be connected to circles along the bottom in a row. 10 circles at the top, 10 at the bottom, one connection per pair of dots. Anyway, I can get the dots to connect; I'm just not sure how to wrap it in a data model so that I can analyse what has been connected and see whether it is right or not. The circles will be questions and answers. I can make an array of question objects with question and answer properties and then display these as the dot pairs. I'm just not sure how to record which questions have been connected to which answers. It is valid for a user to connect a wrong answer, as they all get checked at the end. I was thinking of using SpriteKit, but this isn't a restriction; I could use UIKit or something else. TBH, this question is fairly language-free, as I'm just after a way of modelling it.
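    One simple, framework-independent way to model this (a sketch in Java, with illustrative names): identify each top and bottom dot by its index, keep the correct pairing in one map, record the user's connections in another, and only compare the two when the round is checked. Wrong connections are allowed to sit in the chosen map until then.

        import java.util.*;

        class MatchingRound {
            final List<String> questions;                         // top row; index = dot position
            final List<String> answers;                           // bottom row; index = dot position
            final Map<Integer, Integer> correct;                  // question index -> correct answer index
            final Map<Integer, Integer> chosen = new HashMap<>(); // question index -> answer the user picked

            MatchingRound(List<String> questions, List<String> answers, Map<Integer, Integer> correct) {
                this.questions = questions;
                this.answers = answers;
                this.correct = correct;
            }

            void connect(int questionIndex, int answerIndex) {
                chosen.put(questionIndex, answerIndex);           // one connection per question dot
            }

            long score() {                                        // checked only at the end
                return chosen.entrySet().stream()
                        .filter(e -> e.getValue().equals(correct.get(e.getKey())))
                        .count();
            }
        }

    The SpriteKit or UIKit layer then only has to translate a drag between two dots into a connect(question, answer) call; the model stays independent of how the circles are drawn.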

    Read the article

  • Enforcing Constraints Upon Data Documents of Various Formats

    - by Christopher Berman
    This seems like the sort of problem that must have been solved elegantly long ago, but I haven't the foggiest how to google it and find it. Suppose you're maintaining a large legacy system, which has a large collection of data (tens of GB) of various formats, including XML and two different internal configuration formats. Suppose further that there are abstract rules governing the values these files may or may not contain. EXAMPLE: File A defines the raw, mathematical data pertaining to the aerodynamics of a car for consumption by the physics component of the system. File B contains certain values from File A in an easily accessible XML hierarchy for consumption by a different component of the system. There exists, therefore, an abstract rule (or constraint) such that the values from File B must match the values from File A. This is probably the simplest constraint that can be specified, but in practice, the constraints between files can become very complicated indeed. What is the best method for managing these constraints between files of arbitrary formats, short of migrating everything over to an RDBMS (which simply isn't feasible for the foreseeable future)? Has this problem been solved already? To be more specific, I would expect the solution to at least produce notifications of violated constraints; the solution need not resolve the constraints. ============================== Sample file structures File A (JeepWrangler2011.emv): MODEL JeepWrangler2011 { EsotericMathValueX 11.1 EsotericMathValueY 22.2 EsotericMathValueZ 33.3 } File B (JeepWrangler2011.xml): <model name="JeepWrangler2011"> <!--These values must correspond to File A's EsotericMathValues--> <modelExtent x="11.1" y="22.2" z="33.3"/> [...] </model>
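    One lightweight approach, short of an RDBMS, is to express each cross-file rule as a pair of value extractors (one per file format) plus a comparison, and run all rules as a validation pass that only reports violations. A minimal sketch in Java, with illustrative names and the parsing of each format deliberately elided:

        import java.util.function.Function;

        // A cross-file rule: pull one value out of each (already parsed) file and compare.
        record Constraint<A, B>(String description,
                                Function<A, String> fromFileA,
                                Function<B, String> fromFileB) {

            /** Returns a violation message, or null when the constraint holds. */
            String check(A fileA, B fileB) {
                String left = fromFileA.apply(fileA);
                String right = fromFileB.apply(fileB);
                return left.equals(right)
                        ? null
                        : description + ": '" + left + "' (File A) != '" + right + "' (File B)";
            }
        }

    A nightly job or pre-commit hook can then evaluate every registered Constraint against the parsed files and emit the non-null messages as notifications, which matches the requirement of reporting rather than resolving violations.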

    Read the article

  • What is the better way to retrieve data in an application that needs to handle a limited amount of data?

    - by Milanix
    Just moved this question from Stack Overflow, since adding my code snippets would make it really long. I am interested in better ways of retrieving data in an application that needs to handle a limited amount of data which isn't updated regularly. Let's take this example: I am writing an application which gets a schedule as XML from a server. I have written logic to parse the XML version and update the database only if the version is newer than the local version. Although the update is checked automatically/manually on a daily basis based on user preference, the actual version update happens only once every few months or so, since it is done by some other authority which doesn't provide an API but rather announces its changes publicly. The actual XML contains (n groups) x (days in a week) x (n schedules). The number of groups is usually 6 and the number of schedules is usually 2, so basically there would usually be only around 100 strings. Although I am using SQLite at the moment, I want to know how to make the update to the database. Should I show a progress dialog saying that the application is updating and exit the app when it's done? Since my updates are infrequent, I don't think this will really harm the user experience, but is there a better way to do it? I don't want the update to run while the user is searching, which is done using the database, because that will cause a "database already open" exception; at least I have faced this problem before. Is it better to parse the XML every time the user wants to view something, or to use SQLite? Since I make a lot of use of adapters in my app to create lists, will that degrade performance?
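    On the "database already open" point: the usual Android fix is to keep a single application-wide SQLiteOpenHelper and have both the background update and the search share it, so the two never race to open the file; the update can then run quietly in the background instead of blocking the UI with a dialog. A minimal sketch, assuming a single schedule table (class, database and column names are illustrative):

        import android.content.Context;
        import android.database.sqlite.SQLiteDatabase;
        import android.database.sqlite.SQLiteOpenHelper;

        public class ScheduleDbHelper extends SQLiteOpenHelper {
            private static ScheduleDbHelper instance;

            // Single shared helper: every reader/writer goes through the same connection.
            public static synchronized ScheduleDbHelper get(Context context) {
                if (instance == null) {
                    instance = new ScheduleDbHelper(context.getApplicationContext());
                }
                return instance;
            }

            private ScheduleDbHelper(Context context) {
                super(context, "schedule.db", null, 1);
            }

            @Override
            public void onCreate(SQLiteDatabase db) {
                // "grp" instead of "group" to avoid the SQL keyword.
                db.execSQL("CREATE TABLE schedule (grp INTEGER, day INTEGER, slot INTEGER, label TEXT)");
            }

            @Override
            public void onUpgrade(SQLiteDatabase db, int oldVersion, int newVersion) {
                db.execSQL("DROP TABLE IF EXISTS schedule");
                onCreate(db);
            }
        }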

    Read the article

  • Unmet Dependencies with kdelibs5-data

    - by Jitesh
    I was trying to install Amarok 1.4 on Ubuntu 12.04 (Gnome-classic), by following this instructions. Problem started after giving these two commands dpkg -i kdelibs5-data_4.6.2-0ubuntu4_all.deb dpkg -i kdelibs-data_3.5.10.dfsg.1-5ubuntu2_all.deb Now, immediately after these commands, Ubuntu Updater popped up and gave me an error that the package catalog is broken and needs to be repaired. Nothing can be installed or removed till then. It also offered a suggestion to run apt-get install -f. I tried that, but again got the same error.Also tried apt-get clean followed by apt-get install -f. Again got the following output: jitesh@jitesh-desktop:~$ sudo apt-get clean jitesh@jitesh-desktop:~$ sudo apt-get install -f Reading package lists... Done Building dependency tree Reading state information... Done Correcting dependencies... Done The following extra packages will be installed: kdelibs5-data The following packages will be upgraded: kdelibs5-data 1 upgraded, 0 newly installed, 0 to remove and 18 not upgraded. 2 not fully installed or removed. Need to get 2,832 kB of archives. After this operation, 2,998 kB disk space will be freed. Do you want to continue [Y/n]? y Get:1 http: //in.archive.ubuntu.com/ubuntu/precise-updates/main kdelibs5-data all 4:4.8.4a-0ubuntu0.2 [2,832 kB] Fetched 2,832 kB in 32s (86.6 kB/s) dpkg: dependency problems prevent configuration of kdelibs5-data: libplasma3 (4:4.8.4a-0ubuntu0.2) breaks kdelibs5-data (<< 4:4.6.80~) and is installed. Version of kdelibs5-data to be configured is 4:4.6.2-0ubuntu4. kate-data (4:4.8.4-0ubuntu0.1) breaks kdelibs5-data (<< 4:4.6.90) and is installed. Version of kdelibs5-data to be configured is 4:4.6.2-0ubuntu4. katepart (4:4.8.4-0ubuntu0.1) breaks kdelibs5-data (<< 4:4.6.90) and is installed. Version of kdelibs5-data to be configured is 4:4.6.2-0ubuntu4. dpkg: error processing kdelibs5-data (--configure): dependency problems - leaving unconfigured dpkg: dependency problems prevent configuration of kdelibs-data: kdelibs-data depends on kdelibs5-data; however: Package kdelibs5-data is not configured yet. No apport report written because MaxReports is reached already dpkg: error processing kdelibs-data (--configure): dependency problems - leaving unconfigured No apport report written because MaxReports is reached already Errors were encountered while processing: kdelibs5-data kdelibs-data W: Duplicate sources.list entry http://archive.canonical.com/ubuntu/ precise/partner i386 Packages (/var/lib/apt/lists/archive.canonical.com_ubuntu_dists_precise_partner_binary-i386_Packages) W: You may want to run apt-get update to correct these problems E: Sub-process /usr/bin/dpkg returned an error code (1) As I thought the error was related to configuring kdelibs, I tried to configure using dpkg. But got the following errors: jitesh@jitesh-desktop:~$ sudo dpkg --configure -a dpkg: dependency problems prevent configuration of kdelibs5-data: libplasma3 (4:4.8.4a-0ubuntu0.2) breaks kdelibs5-data (<< 4:4.6.80~) and is installed. Version of kdelibs5-data to be configured is 4:4.6.2-0ubuntu4. kate-data (4:4.8.4-0ubuntu0.1) breaks kdelibs5-data (<< 4:4.6.90) and is installed. Version of kdelibs5-data to be configured is 4:4.6.2-0ubuntu4. katepart (4:4.8.4-0ubuntu0.1) breaks kdelibs5-data (<< 4:4.6.90) and is installed. Version of kdelibs5-data to be configured is 4:4.6.2-0ubuntu4. 
dpkg: error processing kdelibs5-data (--configure): dependency problems - leaving unconfigured dpkg: dependency problems prevent configuration of kdelibs-data: kdelibs-data depends on kdelibs5-data; however: Package kdelibs5-data is not configured yet. dpkg: error processing kdelibs-data (--configure): dependency problems - leaving unconfigured Errors were encountered while processing: kdelibs5-data kdelibs-data jitesh@jitesh-desktop:~$ Now I dont have any idea how to proceed. I am unable to install anything from Software Centre or using Terminal now. Some basic info: Core2Duo, dual booting Ubuntu 12.04 with Win7. Fresh install of Ubuntu 12.04 (not upgrade). Incidentally, I had first upgraded from 10.04 and had succesfully installed Amarok 1.4 following this same method. But due to other issues, i had to format and do a clean install of 12.04. Now when I tried to install Amarok 1.4, I'm getting these errors. I also have digiKam and k3b installed, if that can be of any help. I use digiKam a lot, so removing KDE is not feasible for me. Any help on this issue will be highly appreciated. Thanks in advance.

    Read the article

  • Prevent malicious vulnerability scans from increasing load on a server

    - by Simon
    Hi all, this week we have been suffering some malicious vulnerability scans to our servers, increasing the load on them, making them nearly unusable. The attack is easy to defend, just blocking the offending ip, but only after discovering it. Is there any form of prevent it? Is it normal that one server becomes nearly unusable due to one of these scans? These are the requests done in just one second to our server: [Fri Mar 12 19:15:27 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/zope trunk 2 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/8872fcacd7663c040f0149ed49f572e9 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/188201 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/74e118780caa0f5232d6ec393b47ae01 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/87d4b821b2b6b9706ba6c2950c0eaefd [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/138917 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/180377 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/182712 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/compl2s [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/e7ba351f0ab1f32b532ec679ac7d589d [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/184530 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/compl_s [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/55542 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/7b9d5a65aab84640c6414a85cae2c6ff [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/77257 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/157611 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/textwrapping [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/51713 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/elina [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/fd4800093500f7a9cc21bea232658706 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/59719 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/administrationexamples [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/29587 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/bdebc9c4aa95b3651e9b8fd90c015327 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/defaultchangenotetext [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/figments [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/69744 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/fastpixelperfect [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: 
/var/www/html/conchmusicsoundtoolkit [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/settingwindowposition [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/windowresizing [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/84784 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/186114 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/99858 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/131677 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/167783 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/99933 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/3en17ljttc [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/gradientcode [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/pythondevelopmentandnavigationwithspe [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/10546 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/167932 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/smallerrectforspritecollision [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/176292 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/3sumvid-19yroldfuckedby2bigcocks [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/67909 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/175185 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/131319 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/99900 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/act5 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/contributors-agreement [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/128447 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/71052 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/114242 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/69768 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/debuggingwithwinpdbfromwithinspe [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/39360 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/176267 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/143468 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/140202 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/25268 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/82241 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not 
exist: /var/www/html/142920 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/downloadingipythonformswindows [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/34367 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/for_collaborators [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/pydeveclipseextensionsfabio [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/usingpdbinipython [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/142264 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/49003 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/gamelets [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/texturecoordinatearithmetic [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/project_interface [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/143177 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/pydeveclipsefabio [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/91525 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/40426 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/134819 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/usingipythonwithtextpad [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/developingpythoninipython [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/35569 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/objfileloader [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/simpleopengl2dclasses [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/191495 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/3dvilla [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/145368 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/140118 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/87799 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/142320 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/glslexample [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/39826 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/cairopygame [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/191338 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/91819 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/152003 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/gllight [Fri Mar 12 19:15:28 2010] [error] [client 
213.37.49.231] File does not exist: /var/www/html/40567 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/137877 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/188209 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/84577 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/131017 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/fightnight [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/79781 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/4731669 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/161942 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/160289 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/81594 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/12127 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/164452 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/96823 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/163598 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/159190 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/step-test fsfs+ra_local [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/davros [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/step-publish logs [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/step-cleanup [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/step-test fsfs+ra_svn [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/cdrwin_v3 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/brianpensive [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/x86-openbsd shared gcc [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/roundup-0 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/svcastle [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/56584 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/45934 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/step-build [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/97194 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/cdrwin_3 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/72243 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/117043 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/147084 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/52713 [Fri Mar 12 
19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/101489 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/134867 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/win32-dependencies [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/36548 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/43827 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/100791 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/elita_posing [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/167848 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/36314 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/49951 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/142740 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/cdromkiteletronicaptg [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/138060 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/68483 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/184474 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/137447 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/sndarray [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/127870 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/167312 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/75411 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/167969 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/surfarray [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/174941 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/59129 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/147554 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/105577 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/91734 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/96679 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/06au [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/124495 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/aah [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/164439 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/12638190 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/eliel [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/171164 [Fri Mar 12 19:15:28 2010] [error] 
[client 213.37.49.231] File does not exist: /var/www/html/linearinterpolator [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/step-test [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/heading_news [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/87778 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/portlet_64568222 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/graphic_ep [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/132230 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/12251 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/greencheese [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/188966 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/cdsonic [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/171522 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/elitewrap [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/184313 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/188079 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/147511 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/160952 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/132581 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/84885 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/graphic_desktop [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/win32-xp vs2005 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/128548 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/92057 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/65235 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/pyscgi [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/56926 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/svcastle-big [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/138553 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/138232 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/153367 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/42315 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/150012 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/160079 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/win32-xp vc60 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/163482 [Fri Mar 12 19:15:28 
2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/42642 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/174458 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/163109 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/spacer_greys [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/pdf_icon16 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/26346 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/190998 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/fforigins [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/aliens-0 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/step-update faad [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/13376 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/52647 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/155036 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/compl2 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/174323 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/42317 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/tsugumo [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/171850 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/184127 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/48321 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/162545 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/84180 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/135901 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/57817 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/6360574 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/124989 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/113314 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/sprite-tutorial [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/14294 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/191387 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/187294 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/178666 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/179653 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/wingide-users [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/16309095 [Fri Mar 12 19:15:28 2010] [error] 
[client 213.37.49.231] File does not exist: /var/www/html/169465 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/189399 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/172392 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/35627 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/2670901 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/177847 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/chimplinebyline [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/87518 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/154595 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/12811780 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/cdmenupro42 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/110131 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/95615 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/18464 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/lwedchoice-1999 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/5099582 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/100968 [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/j-emacs [Fri Mar 12 19:15:28 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/0206mathew [Fri Mar 12 19:15:29 2010] [error] [client 213.37.49.231] File does not exist: /var/www/html/10844356 Thanks in advance!

    Read the article

  • Can compressing Program Files save space *and* give a significant boost to SSD performance?

    - by Christopher Galpin
    Considering solid-state disk space is still an expensive resource, compressing large folders has appeal. Thanks to VirtualStore, could Program Files be a case where it might even improve performance? Discovery In particular I have been reading: SSD and NTFS Compression Speed Increase? Does NTFS compression slow SSD/flash performance? Will somebody benchmark whole disk compression (HD,SSD) please? (may have to scroll up) The first link is particularly dreamy, but maybe head a little too far in the clouds. The third link has this sexy semi-log graph (logarithmic scale!). Quote (with notes): Using highly compressable data (IOmeter), you get at most a 30x performance increase [for reads], and at least a 49x performance DECREASE [for writes]. Assuming I interpreted and clarified that sentence correctly, this single user's benchmark has me incredibly interested. Although write performance tanks wretchedly, read performance still soars. It gave me an idea. Idea: VirtualStore It so happens that thanks to sanity saving security features introduced in Windows Vista, write access to certain folders such as Program Files is virtualized for non-administrator processes. Which means, in normal (non-elevated) usage, a program or game's attempt to write data to its install location in Program Files (which is perhaps a poor location) is redirected to %UserProfile%\AppData\Local\VirtualStore, somewhere entirely different. Thus, to my understanding, writes to Program Files should primarily only occur when installing an application. This makes compressing it not only a huge source of space gain, but also a potential candidate for performance gain. Testing The beginning of this post has me a bit timid, it suggests benchmarking NTFS compression on a whole drive is difficult because turning it off "doesn't decompress the objects". However it seems to me the compact command is perfectly capable of doing so for both drives and individual folders. Could it be only marking them for decompression the next time the OS reads from them? I need to find the answer before I begin my own testing.

    Read the article

  • High-performance Hibernate insert

    - by luke
    I am working on a latency sensitive part of an application, basically i will receive a network event transform the data and then insert all the data into the DB. After profiling i see that basically all my time is spent trying to save the data. here is the code private void insertAllData(Collection<Data> dataItems) { long start_time = System.currentTimeMillis(); long save_time = 0; long commit_time = 0; Transaction tx = null; try { Session s = HibernateSessionFactory.getSession(); s.setCacheMode(CacheMode.IGNORE); s.setFlushMode(FlushMode.NEVER); tx = s.beginTransaction(); for(Data data : dataItems) { s.saveOrUpdate(data); } save_time = System.currentTimeMillis(); tx.commit(); s.flush(); s.clear(); } catch(HibernateException ex) { if(tx != null) tx.rollback(); } commit_time = System.currentTimeMillis(); System.out.println("Save: " + (save_time - start_time)); System.out.println("Commit: " + (commit_time - save_time)); System.out.println(); } The size of the collection is always less than 20. here is the timing data that i see: Save: 27 Commit: 9 Save: 27 Commit: 9 Save: 26 Commit: 9 Save: 36 Commit: 9 Save: 44 Commit: 0 This is confusing to me. I figure that the save should be quick and all the time should be spent on commit. but clearly I'm wrong. I have also tried removing the transaction (its not really necessary) but i saw worse times... I have set hibernate.jdbc.batch_size=20... i need this operation to be as fast as possible, ideally there would only be one roundtrip to the database. How can i do this?
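    A minimal sketch, assuming Hibernate 3.x-style APIs (org.hibernate.Session, Transaction, HibernateException), of the usual shape for a fast batched insert: use plain save for brand-new rows, flush and clear periodically inside the loop, and flush before the commit rather than after it so the queued inserts actually go out inside the transaction. Note also that Hibernate disables JDBC insert batching for entities that use the identity ID generator, which is a common reason the save phase stays slow. Names below mirror the question's code but are illustrative:

        private void insertAllBatched(Collection<Data> dataItems) {
            Session s = HibernateSessionFactory.getSession();
            Transaction tx = s.beginTransaction();
            try {
                int i = 0;
                for (Data data : dataItems) {
                    s.save(data);                 // plain save is enough for brand-new rows
                    if (++i % 20 == 0) {          // matches hibernate.jdbc.batch_size=20
                        s.flush();                // push the current batch to the JDBC driver
                        s.clear();                // keep the first-level cache small
                    }
                }
                s.flush();                        // flush what is left *before* committing
                tx.commit();
            } catch (HibernateException ex) {
                tx.rollback();
                throw ex;
            }
        }

    If even one round trip per batch is too much, a StatelessSession or a hand-rolled JDBC batch over the same connection removes the Session bookkeeping entirely; whether that helps here depends on where the 27 ms actually goes (entity state management vs. network latency).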

    Read the article

  • Performance question: Inverting an array of pointers in-place vs array of values

    - by Anders
    The background for asking this question is that I am solving a linearized equation system (Ax=b), where A is a matrix (typically of dimension less than 100x100) and x and b are vectors. I am using a direct method, meaning that I first invert A, then find the solution by x=A^(-1)b. This step is repeated in an iterative process until convergence. The way I'm doing it now, using a matrix library (MTL4): for every iteration I copy all coefficients of A (values) into the matrix object, then invert. This is the easiest and safest option. Using an array of pointers instead: for my particular case, the coefficients of A happen to be updated between iterations. These coefficients are stored in different variables (some are arrays, some are not). Would there be a potential performance gain if I set up A as an array containing pointers to these coefficient variables, then inverted A in-place? The nice thing about the last option is that once I have set up the pointers in A before the first iteration, I would not need to copy any values between successive iterations. The values which are pointed to in A would automatically be updated between iterations. So the performance question boils down to this, as I see it: - The matrix inversion process takes roughly the same amount of time, assuming de-referencing of pointers is non-expensive. - The array of pointers does not need the extra memory for a matrix A containing values. - The array-of-pointers option does not have to copy all NxN values of A between each iteration. - The values that are pointed to by the array-of-pointers option are generally NOT ordered in memory. Hopefully, all values lie relatively close in memory, but *A[0][1] is generally not next to *A[0][0] etc. Any comments on this? Will the last remark affect performance negatively, thus outweighing the positive performance effects?

    Read the article

  • C# Custom Control Properties load before Load and arrays are empty

    - by Wildhorn
    Hello, I made a custom control with custom properties. When I modify these properties in the "Form.cs [Design]" view, I need stuff to happen (it fills some arrays and modifies the look of the control), so I call a function within the "set" of the property. All of this works fine. My problem is that when I run the program with my custom control, it seems that the property "set" is called, which calls the function, but by then my arrays seem to have lost all their values; the function uses these arrays, and because they are all empty, it crashes with a NullException. It also seems that the property "set" is called before the control's Load (which I guess is called only when it is added to my Form, and not when the Form loads). So the question is: why do my arrays become empty once I try to run the Form, and is there an event that is called before that, when the Form loads? Thanks

    Read the article

  • ASP.NET Dynamic Data Deployment Error

    - by rajbk
    You have an ASP.NET 3.5 dynamic data website that works great on your local box. When you deploy it to your production machine and turn on debug, you get the YSD Server Error in '/MyPath/MyApp' Application. Parser Error Description: An error occurred during the parsing of a resource required to service this request. Please review the following specific parse error details and modify your source file appropriately. Parser Error Message: Unknown server tag 'asp:DynamicDataManager'. Source Error: Line 5:  Line 6:  <asp:Content ID="Content1" ContentPlaceHolderID="ContentPlaceHolder1" Runat="Server"> Line 7:      <asp:DynamicDataManager ID="DynamicDataManager1" runat="server" AutoLoadForeignKeys="true" /> Line 8:  Line 9:      <h2><%= table.DisplayName%></h2> Probable Causes The server does not have .NET 3.5 SP1, which includes ASP.NET Dynamic Data, installed. Download it here. The third tagPrefix shown below is missing from web.config <pages> <controls> <add tagPrefix="asp" namespace="System.Web.UI" assembly="System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35"/> <add tagPrefix="asp" namespace="System.Web.UI.WebControls" assembly="System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35"/> <add tagPrefix="asp" namespace="System.Web.DynamicData" assembly="System.Web.DynamicData, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35"/> </controls></pages>     Hope that helps!

    Read the article

  • SQL Server 2012 : The Data Tools installer is now available

    - by AaronBertrand
    Last week when RC0 was released, the updated installer for "Juneau" (SQL Server Data Tools) was not available. Depending on how you tried to get it, you either ended up on a blank search page, or a page offering the CTP3 bits. Important note: the CTP3 Juneau bits are not compatible with SQL Server 2012 RC0. If you already have Visual Studio 2010 installed (meaning Standard/Pro/Premium/Ultimate), you will need to install Service Pack 1 before continuing. You can get to the installer simply by opening...(read more)

    Read the article

  • Looking for the best EC2 setup for 3 sites totaling 1.5 million in monthly traffic

    - by john h.
    I am looking to consolidate our current AWS setup of 2 large Ubuntu EC2 servers and 2 large RDS servers for our 3 websites, which have a total of about 1.5 million hits a month and increasing every month, with the majority of the traffic (1 million) going to one forum site in the group and the rest of the traffic to an ecommerce site and a small WordPress site. So here is my question/thought: Would it be better for us to combine the two large EC2 servers into just one, and the same with the 2 RDS servers, so we run all three sites off one large EC2 instance and one RDS? -or- Should we set up maybe 2-3 smaller EC2 servers, load balanced, and a single RDS? -or- Some completely different setup? One concern is that if one site crashes it takes the others with it. It happened in the past, but I am pretty sure it was because of the forum software and not the server setup. -john

    Read the article

  • SQL Developer Data Modeler v3.3 Early Adopter: Link Model Objects Across Designs

    - by thatjeffsmith
    The third post in our “What’s New in SQL Developer Data Modeler v3.3” series, SQL Developer Data Modeler now allows you to link objects across models. If you need to catch up on the earlier posts, here are the first two: New and Improved Search Collaborative Design via Excel Today’s post is a very simple and straightforward discussion on how to share objects across models and designs. In previous releases you could easily copy and paste objects between models and designs. Simply select your object, right-click and select ‘Copy’ Once copied, paste it into your other designs and then make changes as required. Once you paste the object, it is no longer associated with the source it was copied from. You are free to make any changes you want in the new location without affecting the source material. And it works the other way as well – make any changes to the source material and the new object is also unaffected. However. What if you want to LINK a model object instead of COPYING it? In version 3.3, you can now do this. Simply drag and drop the object instead of copy and pasting it. Select the object, in this case a relational model table, and drag it to your other model. It’s as simple as it sounds, here’s a little animated GIF to show you what I’m talking about. Drag and drop between models/designs to LINK an object Notes The ‘linked’ object cannot be modified from the destination space Updating the source object will propagate the changes forward to wherever it’s been linked You can drag a linked object to another design, so dragging from A - B and then from B - C will work Linked objects are annotated in the model with a ‘Chain’ bitmap, see below This object has been linked from another design/model and cannot be modified. A very simple feature, but I like the flexibility here. Copy and paste = new independent object. Drag and drop = linked object.

    Read the article

  • My favorite BI and DW blogs, part 7: Oracle Data Warehousing

    - by Fekete Zoltán
    I recommend the following substantial blog to the reader's attention: The Data Warehouse Insider: http://blogs.oracle.com/datawarehousing/ It ranges from general data warehousing concepts to best-practice lessons from implementations and design. Topics: star schemas, partitioning, OLAP, 3NF, parallel processing, data loading, ETL-ELT, data models, events, Exadata, Database Machine, compression, data mining, customer stories, ...

    Read the article

  • Data structures in functional programming

    - by pwny
    I'm currently playing with LISP (particularly Scheme and Clojure) and I'm wondering how typical data structures are dealt with in functional programming languages. For example, let's say I would like to solve a problem using a graph pathfinding algorithm. How would one typically go about representing that graph in a functional programming language (primarily interested in pure functional style that can be applied to LISP)? Would I just forget about graphs altogether and solve the problem some other way?
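    For the concrete case of a graph: the usual functional representation is simply an immutable map from node to neighbour set (in Clojure, a persistent hash map of sets), and "modifying" the graph means building a new map that shares most of its structure with the old one, so the pathfinding algorithm becomes a pure function from (graph, start, goal) to a path. A rough sketch of that interface in Java, which copies instead of structure-sharing and is therefore only an illustration of the idea, not of the performance:

        import java.util.*;

        final class PersistentishGraph<N> {
            private final Map<N, Set<N>> adjacency;

            PersistentishGraph() { this(Map.of()); }
            private PersistentishGraph(Map<N, Set<N>> adjacency) { this.adjacency = adjacency; }

            /** Returns a new graph with one extra directed edge; the receiver is unchanged. */
            PersistentishGraph<N> withEdge(N from, N to) {
                Map<N, Set<N>> next = new HashMap<>(adjacency);
                Set<N> neighbours = new HashSet<>(next.getOrDefault(from, Set.of()));
                neighbours.add(to);
                next.put(from, Collections.unmodifiableSet(neighbours));
                return new PersistentishGraph<>(Collections.unmodifiableMap(next));
            }

            Set<N> neighbours(N node) {
                return adjacency.getOrDefault(node, Set.of());
            }
        }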

    Read the article

  • When does 'optimizing code' == 'structuring data'?

    - by NewAlexandria
    A recent article on Hacker News (ycombinator) lists a comment with principles of a great programmer. #7. Good programmer: I optimize code. Better programmer: I structure data. Best programmer: What's the difference? Acknowledging that these are subjective and contentious concepts - does anyone have a position on what this means? I do, but I'd like to edit this question later with my thoughts so as not to predispose the answers.

    Read the article

  • Single Key Multiple Values Data Structure for one-to-many mapping

    - by nijhawan.saurabh
    Dictionaries are good; they are great for storing key/value pairs. But what if you want to store multiple values for a single key? Dictionaries do not allow duplicate keys. I came across a nice way to represent such a data structure using one of the extension methods (ToLookup) in the System.Linq namespace, which converts an IEnumerable<T> to an ILookup<TKey, TElement>. This method expects two parameters (the other overload expects 3): IEnumerable<TSource> - the list containing the actual data. Func<TSource, TKey> keySelector - the delegate which computes the keys. The method returns an ILookup<TKey, TElement>. This data structure stores keys and the multiple values grouped under each key. Let's see a small example: using System; using System.Collections.Generic; using System.Linq; internal class Program { private static void Main(string[] args) { // Create a list of strings. var list = new List<string> { "IceCream1", "Chocolate Moose", "IceCream2" }; // Generate a lookup data structure keyed by string length. ILookup<int, string> lookupDs = list.ToLookup(item => item.Length); // Enumerate groupings. foreach (var group in lookupDs) { foreach (string element in group) { Console.WriteLine(element); } } } }

    Read the article

  • Data Education: Great Classes Coming to a City Near You

    - by Adam Machanic
    In case you haven't noticed, Data Education (the training company I started a couple of years ago) has expanded beyond the US northeast; we're currently offering courses with top trainers in both St. Louis and Chicago, as well as the Boston area. The courses are starting to fill up fast—not surprising when you consider we're talking about experienced instructors like Kalen Delaney, Rob Farley, and Allan Hirt—but we still have some room. We're very excited about bringing the highest quality...(read more)

    Read the article

  • Extracting the Layout of all the Data Forms from the Relational Database

    - by RahulS
    Today I came across a question from one of our clients: "What members are used on each data form, WITHOUT having to go through the report generated out of our Planning app?" We worked with the client on this and arrived at a simple query. All the form-related information is stored in the following tables: HSP_FORM, HSP_FORMOBJ_DEF, HSP_FORMOBJ_DEF_MBR, HSP_FORM_ATTRIBUTES, HSP_FORM_CALCS, HSP_FORM_DV_CONDITION, HSP_FORM_DV_PM_RULE, HSP_FORM_DV_RULE, HSP_FORM_DV_USER_IN_PM_RULE, HSP_FORM_LAYOUT, HSP_FORM_MENUS, HSP_FORM_VARIABLES. If we want to retrieve just the members included, we can concentrate on: HSP_OBJECT, to get the Object_ID for the form (Object_Type is 7 for forms, e.g. Select * from HSP_OBJECT where OBJECT_TYPE = 7); HSP_FORMOBJ_DEF, to find the OBJDEF_ID for a particular form; and HSP_FORMOBJ_DEF_MBR, where the above OBJDEF_ID leads to the members. Here Mbr_ID is the ID of the member, Query_Type is the function (Idesc, Level0, etc.), and Sequence is your sequence. The final table we can use is HSP_FORM_LAYOUT, where Layout_Type is 0 for POV, 1 for Page, 2 for Row, 3 for Column; DIM_ID is the dimension ID and Ordinal is the position. Here is the query:

        SELECT HSP_OBJECT.OBJECT_NAME AS 'Form',
               HSP_OBJECT_2.OBJECT_NAME AS 'Dimension',
               HSP_OBJECT_1.OBJECT_NAME AS 'Member',
               HSP_FORMOBJ_DEF_MBR.QUERY_TYPE
        FROM
               <DatabaseName>.dbo.HSP_FORM_LAYOUT HSP_FORM_LAYOUT,
               <DatabaseName>.dbo.HSP_FORMOBJ_DEF HSP_FORMOBJ_DEF,
               <DatabaseName>.dbo.HSP_FORMOBJ_DEF_MBR HSP_FORMOBJ_DEF_MBR,
               <DatabaseName>.dbo.HSP_MEMBER HSP_MEMBER,
               <DatabaseName>.dbo.HSP_OBJECT HSP_OBJECT,
               <DatabaseName>.dbo.HSP_OBJECT HSP_OBJECT_1,
               <DatabaseName>.dbo.HSP_OBJECT HSP_OBJECT_2
        WHERE
               HSP_OBJECT.OBJECT_ID = HSP_FORMOBJ_DEF.FORM_ID AND
               HSP_FORMOBJ_DEF_MBR.OBJDEF_ID = HSP_FORMOBJ_DEF.OBJDEF_ID AND
               HSP_MEMBER.MEMBER_ID = HSP_FORMOBJ_DEF_MBR.MBR_ID AND
               HSP_OBJECT_1.OBJECT_ID = HSP_MEMBER.MEMBER_ID AND
               HSP_OBJECT_2.OBJECT_ID = HSP_MEMBER.DIM_ID AND
               HSP_FORM_LAYOUT.DIM_ID = HSP_MEMBER.DIM_ID AND
               HSP_FORM_LAYOUT.FORM_ID = HSP_FORMOBJ_DEF.FORM_ID AND
               ((HSP_OBJECT.OBJECT_TYPE=7))
        ORDER BY HSP_OBJECT.OBJECT_NAME

    Concentrate on the Test1 data form and its actual layout. Corresponding Query_Type values for a few of the functions: 9 for Idesc, 3 for Ancestors, -9 for ILvl0Des, 8 for Desc, 4 for IAncestors. It's just a basic idea; you can do a lot on the basis of this. Cheers..!!! Rahul S. http://www.facebook.com/pages/HyperionPlanning/117320818374228
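
    As a rough follow-on sketch, the same lookup can be driven from a small client program. The connection string, database name, and the 'Test1' form-name filter below are assumptions, and the join is rewritten in ANSI style with the HSP_FORM_LAYOUT join omitted for brevity; only the table and column names come from the query above.

        using System;
        using System.Data.SqlClient;

        internal static class FormMemberDump
        {
            private static void Main()
            {
                // Hypothetical connection string; point it at the Planning relational repository.
                const string connStr = "Server=myServer;Database=PlanningApp;Integrated Security=true;";

                // Same tables and columns as the query above, restricted to one form.
                const string sql = @"
                    SELECT o.OBJECT_NAME  AS FormName,
                           d.OBJECT_NAME  AS DimensionName,
                           m.OBJECT_NAME  AS MemberName,
                           fdm.QUERY_TYPE AS QueryType
                    FROM   HSP_OBJECT o
                           JOIN HSP_FORMOBJ_DEF fd      ON fd.FORM_ID    = o.OBJECT_ID
                           JOIN HSP_FORMOBJ_DEF_MBR fdm ON fdm.OBJDEF_ID = fd.OBJDEF_ID
                           JOIN HSP_MEMBER mem          ON mem.MEMBER_ID = fdm.MBR_ID
                           JOIN HSP_OBJECT m            ON m.OBJECT_ID   = mem.MEMBER_ID
                           JOIN HSP_OBJECT d            ON d.OBJECT_ID   = mem.DIM_ID
                    WHERE  o.OBJECT_TYPE = 7
                      AND  o.OBJECT_NAME = 'Test1'
                    ORDER BY d.OBJECT_NAME;";

                using (var conn = new SqlConnection(connStr))
                using (var cmd = new SqlCommand(sql, conn))
                {
                    conn.Open();
                    using (var reader = cmd.ExecuteReader())
                    {
                        while (reader.Read())
                        {
                            Console.WriteLine("{0} | {1} | {2} | {3}",
                                reader["FormName"], reader["DimensionName"],
                                reader["MemberName"], reader["QueryType"]);
                        }
                    }
                }
            }
        }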

    Read the article

  • Windows replacement for HAProxy

    - by GrayWizardx
    It does not appear that there is a similar question to this already posted, so I will go ahead and ask. I am working on a project that could benefit from having two to four servers handling incoming requests to a backend webservice. The service does not require SSL but does need to support occasional long-running processes (up to 120 secs). This project does not at present have the funding to purchase a hardware load-balancing solution. I have previously used HAProxy as a solution for this, and found it very simple and straightforward. Is there a similar product for Windows (Server 2003 or 2008) which provides similar configuration options and runs as a lightweight service? For reasons outside my control I cannot set up a Linux machine (physical or virtual), so I am looking for something that can be deployed on a Windows machine. I can only find Perlbal, which appears to fall into this category. So as not to keep this open indefinitely, I will give credit to the only answer.

    Read the article

  • Sorting data in the SSIS Pipeline (Video)

    In this post I want to show a couple of ways to order the data that comes into the pipeline. A number of people have asked me about this, primarily because there are a number of ways to do it, but also because some components in the pipeline take sorted inputs. One of the methods I show is visually easy to understand and the other is less visual but potentially more performant.

    Read the article

  • Performance triage

    - by Dave
    Folks often ask me how to approach a suspected performance issue. My personal strategy is informed by the fact that I work on concurrency issues. (When you have a hammer everything looks like a nail, but I'll try to keep this general.) A good starting point is to ask yourself if the observed performance matches your expectations. Expectations might be derived from known system performance limits, prototypes, and other software or environments that are comparable to your particular system-under-test. Some simple comparisons and microbenchmarks can be useful at this stage. It's also useful to write some very simple programs to validate some of the reported or expected system limits. Can that disk controller really tolerate and sustain 500 reads per second? To reduce the number of confounding factors it's better to try to answer that question with a very simple targeted program. And finally, nothing beats having familiarity with the technologies underlying your particular layer.

    On the topic of confounding factors, as our technology stacks become deeper and less transparent, we often find our own technology working against us in some unexpected way to choke performance rather than simply running into some fundamental system limit. A good example is the warm-up time needed by just-in-time compilers in Java Virtual Machines. I won't delve too far into that particular hole except to say that it's rare to find good benchmarks and methodology for Java code. Another example is power management on x86. Power management is great, but it can take a while for the CPUs to throttle up from low(er) frequencies to full throttle. And while I love "turbo" mode, it makes benchmarking applications with multiple threads a chore, as you have to remember to turn it off and then back on; otherwise short single-threaded runs may look abnormally fast compared to runs with higher thread counts. In general, for performance characterization I disable turbo mode and fix the power governor at the "performance" state. Another source of complexity is the scheduler, which I've discussed in prior blog entries.

    Let's say I have a running application and I want to better understand its behavior and performance. We'll presume it's warmed up, is under load, and is in an execution mode representative of what we think the norm would be. It should be in steady-state, if a steady-state mode even exists. On Solaris the very first thing I'll do is take a set of "pstack" samples. Pstack briefly stops the process and walks each of the stacks, reporting symbolic information (if available) for each frame. For Java, pstack has been augmented to understand Java frames, and even report inlining. A few pstack samples can provide powerful insight into what's actually going on inside the program. You'll be able to see calling patterns, which threads are blocked on what system calls or synchronization constructs, memory allocation, etc. If your code is CPU-bound then you'll get a good sense of where the cycles are being spent. (I should caution that normal C/C++ inlining can diffuse an otherwise "hot" method into other methods. This is a rare instance where pstack sampling might not immediately point to the key problem.) At this point you'll need to reconcile what you're seeing with pstack and your mental model of what you think the program should be doing. They're often rather different. And generally, if there's a key performance issue, you'll spot it with a moderate number of samples.

    I'll also use OS-level observability tools to look for the existence of bottlenecks where threads contend for locks; other situations where threads are blocked; and the distribution of threads over the system. On Solaris some good tools are mpstat and, to a lesser degree, vmstat. Try running "mpstat -a 5" in one window while the application program runs concurrently. One key measure is the voluntary context switch rate ("vctx" or "csw"), which reflects threads descheduling themselves. It's also good to look at the user, system, and idle CPU percentages. This can give a broad but useful understanding of whether your threads are mostly parked or mostly running. For instance, if your program makes heavy use of malloc/free, then it might be the case that you're contending on the central malloc lock in the default allocator. In that case you'd see malloc calling into the lock in the stack traces, observe a high csw/vctx rate as threads block for the malloc lock, and your "usr" time would be less than expected. Solaris dtrace is a wonderful and invaluable performance tool as well, but in a sense you have to frame and articulate a meaningful and specific question to get a useful answer, so I tend not to use it for first-order screening of problems. It's also most effective for OS and software-level performance issues as opposed to HW-level issues. For that reason I recommend mpstat & pstack as the first step in performance triage. If some other OS-level issue is evident then it's good to switch to dtrace to drill more deeply into the problem. Only after I've ruled out OS-level issues do I switch to using hardware performance counters to look for architectural impediments.
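
    Picking up the suggestion to validate expected system limits with very simple targeted programs, here is a hedged sketch of such a probe, written in C# only to match the other code on this page; the file path, block size, and run time are assumptions, and the OS file cache will inflate the result unless the test file is much larger than RAM or unbuffered I/O is used:

        using System;
        using System.Diagnostics;
        using System.IO;

        // A deliberately simple probe: how many small random reads per second can
        // this file actually sustain on this machine?
        internal static class ReadRateProbe
        {
            private static void Main()
            {
                const string path = @"C:\temp\probe.dat";   // hypothetical pre-created test file
                const int blockSize = 4096;
                var buffer = new byte[blockSize];
                var rng = new Random(42);

                using (var fs = new FileStream(path, FileMode.Open, FileAccess.Read,
                                               FileShare.Read, blockSize, FileOptions.None))
                {
                    long blocks = fs.Length / blockSize;
                    long reads = 0;
                    var sw = Stopwatch.StartNew();
                    while (sw.Elapsed.TotalSeconds < 10)      // run for roughly ten seconds
                    {
                        fs.Position = rng.Next((int)Math.Min(blocks, int.MaxValue)) * (long)blockSize;
                        fs.Read(buffer, 0, blockSize);
                        reads++;
                    }
                    sw.Stop();
                    Console.WriteLine("{0:F0} reads/sec", reads / sw.Elapsed.TotalSeconds);
                }
            }
        }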

    Read the article
