Search Results

Search found 1668 results on 67 pages for 'legacy'.

Page 44 of 67

  • Problem getting an element by id

    - by Ebe
    I am adding to existing legacy webpages (I'm not supposed to modify the existing parts, only add to them), and the page uses document.write to emit certain HTML elements. When I use

        <script type="text/javascript" language="javascript">
            var v = document.getElementById('td_date_cal_0');
            alert(v);
        </script>

    v is null, but when I create a button and click it

        <input type="button" id="mnbutton" onclick="mnLoader();" value="Click Me!" />
        <script type="text/javascript" language="javascript">
            function mnLoader() {
                var v = document.getElementById("td_date_cal_0");
                alert(v);
            }
        </script>

    the element is found. Any idea how to get the element without the need for a user action, such as clicking? Thanks, Ebe
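
    One likely reason for the null result is that the lookup script runs before the document.write output for that element exists. A minimal sketch of deferring the lookup until the page has loaded, so no user action is needed (this is a guess at the cause, not a confirmed fix):

        <script type="text/javascript">
            // Run the lookup only once the whole page, including any
            // document.write output, has finished loading.
            window.onload = function () {
                var v = document.getElementById('td_date_cal_0');
                alert(v); // should now be the element rather than null
            };
        </script>

    Note that assigning window.onload directly will replace any onload handler the legacy page may already register.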

  • Tool to convert inline C# into a code-behind

    - by Jon Jones
    Hi, I have a number of legacy web controls (.ascx) that contain huge amounts of inline C#. The forms contain a lot of repeated and duplicate code. Our first plan is to move the code into a code-behind per file, then refactor, etc.; we're doing this to upgrade the client to the latest version of their CMS. At the moment it looks like we will have to manually copy and paste hundreds of files, convert the client-side namespace imports into usings, etc. Does anybody PLEASE know of a tool that can do the majority of this work for us? Thanks

  • Why does a derivatives trading position always require C++ knowledge?

    - by Jeffrey
    I've never worked in a trading environment before, and I was curious to see that few of the trading houses seem to use C# while most of them rely heavily on C++. Why is that? Is it because C++ performs better? Is it because of legacy code bases? Is it a cross-platform issue? What about dynamic languages (Ruby, Python): are they too slow for this kind of work in terms of performance? Updated: if reliability and performance are important, would Erlang be the "next big thing" in trading platforms?

  • Get Rails to save a record to the database in a non-UTC time

    - by Shaun
    Is there a way to get Rails to save records to the database without it automagically converting the timestamp into UTC before saving? The problem is that I have a few models that pull data from a legacy database that saves everything in Mountain Time and occasionally I have to have my Rails app write to that database. The problem is that every time it does, it converts the time I give it from Mountain Time to UTC, which is 6-7 hours ahead (depending on DST)! Needless to say, this really messes with reporting on that database. If I could get around doing this, I would. Unfortunately, I can't do anything about the fact that this other database uses a different timezone, nor can I really get away from the need for this app to save to that database occasionally. If I could just get Rails to stop trying to help me, it'd be great.
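
    For reference, a minimal sketch of the ActiveRecord knobs that usually govern this in Rails 2.x/3.x-era apps; the class and connection names are made up, and note that default_timezone is an application-wide setting:

        # config/initializers/legacy_time.rb (hypothetical)
        # Write timestamps to the database in local time rather than UTC.
        ActiveRecord::Base.default_timezone = :local

        # Base class for models backed by the legacy Mountain Time database.
        class LegacyRecord < ActiveRecord::Base
          self.abstract_class = true
          establish_connection :legacy   # assumes a 'legacy' entry in database.yml

          # Skip the to/from Time.zone conversion on these models, so the
          # Mountain Time values are written exactly as assigned.
          self.time_zone_aware_attributes = false
        end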

  • Unit testing with serialization mock objects in C++

    - by lhumongous
    Greetings, I'm fairly new to TDD and ran across a unit test that I'm not entirely sure how to address. Basically, I'm testing a couple of legacy class methods which read/write a binary stream to a file. The class functions take a serializable object as a parameter, which handles the actual reading/writing to the file. For testing this, I was thinking that I would need a serialization mock object that I would pass to this function. My initial thought was to have the mock object hold onto a (char*) which would dynamically allocate memory and memcpy the data. However, it seems like the mock object might be doing too much work, and might be beyond the scope of this particular test. Is my initial approach correct, or can anyone think of another way of correctly testing this? Thanks!
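
    For what it's worth, a lighter-weight shape for such a mock is to let a std::string own the captured bytes instead of a hand-managed char* plus memcpy. A sketch, with the Serializable interface below standing in for whatever the legacy methods actually take:

        #include <string>

        // Stand-in for the legacy serialization interface (the real one may differ).
        class Serializable {
        public:
            virtual ~Serializable() {}
            virtual void Write(const char* data, std::size_t len) = 0;
            virtual std::size_t Read(char* out, std::size_t maxLen) = 0;
        };

        // Mock that simply records whatever the code under test writes, in memory.
        class MockSerializable : public Serializable {
        public:
            void Write(const char* data, std::size_t len) { buffer_.append(data, len); }

            std::size_t Read(char* out, std::size_t maxLen) {
                std::size_t n = buffer_.size() < maxLen ? buffer_.size() : maxLen;
                buffer_.copy(out, n);
                return n;
            }

            const std::string& Captured() const { return buffer_; }

        private:
            std::string buffer_;   // the string manages the memory, no manual bookkeeping
        };

        // In a test:
        //   MockSerializable mock;
        //   legacyObject.Save(mock);                        // method under test
        //   assert(mock.Captured().substr(0, 4) == "HDR1"); // expected header bytes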

  • Embeddable unit testing framework for mixed Windows app

    - by Andy Dent
    I want to test portions of a very complex app which includes both a major native Windows component and a substantial WPF GUI. Due to complexities I can't detail, it is impossible to run the native portion independently, nor can I isolate the areas I want to test (spare me the lectures; we're talking about a huge legacy code base and we do have refactoring plans). I'm looking for a unit test kit I can invoke on the native side, but it must be able to run inside the app as launched with the managed portion initialised. That seems to rule out the run-executable feature of the cfix Windows unit test kit. I really like their philosophy, like WinUnit's, of using DLL compilation as a way to add the reflective capabilities missing in C++ and gain a more NUnit-like experience. Ideally, I want something like WinUnit running within the application code and generating an HTML report. I'm trying to introduce more TDD, and having things as lean as possible is important.

  • Can I use WCF on Visual Studio 2005?

    - by Hemant
    I am about to start a project which consumes third-party web services. Because of a legacy system, I am told that I can only use Visual Studio 2005/.NET 2.0 (though I would have preferred Visual Studio 2008 on .NET 3.5). My understanding is that WCF was released with .NET 3.0. So is there any possibility of using WCF with Visual Studio 2005 by referencing just the WCF assemblies from .NET 3.0? I will then try to convince them that it is just like using an external framework which doesn't disturb anything.
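
    A rough sketch of what that would look like, assuming the .NET 3.0 runtime can be installed on the machines involved: reference System.ServiceModel.dll directly from the VS2005 project and build the proxy by hand with ChannelFactory, since the VS2005 IDE has no WCF tooling. The service contract and URL below are made up:

        using System;
        using System.ServiceModel;

        [ServiceContract]
        public interface IThirdPartyService
        {
            [OperationContract]
            string GetData(int id);
        }

        class Program
        {
            static void Main()
            {
                BasicHttpBinding binding = new BasicHttpBinding();
                EndpointAddress address = new EndpointAddress("http://example.com/service.svc");

                ChannelFactory<IThirdPartyService> factory =
                    new ChannelFactory<IThirdPartyService>(binding, address);
                IThirdPartyService proxy = factory.CreateChannel();

                Console.WriteLine(proxy.GetData(42));
                factory.Close();
            }
        }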

  • Storing a huge amount of records in the classic ASP cache object is SLOW

    - by aspm
    We have some nasty legacy ASP that is performing like a dog, and I narrowed it down to the fact that we are trying to store 15K+ records in the Application cache object. But that's not the killer: before storing it, the code converts the ADO recordset to XML and then stores that. This conversion of the huge recordset to XML spikes the CPU and causes all kinds of havoc for users while it's happening. Unfortunately we also do this XML conversion whenever we read the cache, causing site-wide performance problems. I don't have the resources to convert everything to .NET, so that's out, and I obviously need caching, but in this case the caching is hurting instead of helping. Is there a more efficient way to store this data instead of doing this XML conversion to/from the cache every time we read or update it?
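
    A possible alternative to sketch out, assuming the data starts life as an ADO recordset: cache it as a plain 2D array via GetRows instead of converting to XML, since variant arrays can be stored in and read from the Application object directly. The table, column and cache names here are made up:

        <%
        ' Populate the cache once: GetRows copies the recordset into a 2D array.
        Dim rs, data
        Set rs = conn.Execute("SELECT id, name FROM SomeTable")  ' conn assumed to exist
        data = rs.GetRows()
        rs.Close

        Application.Lock
        Application("SomeTableCache") = data
        Application.Unlock

        ' Reading it back is just array access: dimension 1 = column, 2 = row.
        Dim cached, i
        cached = Application("SomeTableCache")
        For i = 0 To UBound(cached, 2)
            Response.Write cached(1, i) & "<br />"   ' column 1 is "name"
        Next
        %>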

  • Visual C++ 2008: Finding the cause of slow link times

    - by ckarras
    I have a legacy C++ project that takes an annoyingly long time to build (several minutes, even for small incremental changes), and I found most of the time was spent linking. The project is already using precompiled headers and incremental compilation. I enabled the "/time" command line parameter in the hope I would get more detail about what is slowing the linker down, and got the following output:

        1>Linking...
        1> MD Merge: Total time = 59.938s
        1> Generate Transitions: Total time = 0.500s
        1> MD Finalize: Total time = 7.328s
        1>Pass 1: Interval #1, time = 71.718s
        1>Pass 2: Interval #2, time = 8.969s
        1>Final: Total time = 80.687s
        1>Final: Total time = 80.953s

    Is there a way to get more detail about each of these steps? For example, I would like to find out whether most of the time is spent linking a specific .lib or .obj file. Also, is there any documentation that explains what each of these steps does?

  • Choosing between assembler and COBOL

    - by Azares Cob
    I have to rewrite and greatly modify parts of a legacy COBOL application. The COBOL source code is available (around 100,000 lines of copy-and-pasted code mixed with GOTOs). Some more details on the system: it is a general management system controlling transactions, bank management, customer data and employees of the company I work for; the COBOL-powered database is about 4 terabytes distributed over 50 old HDDs (but messing around with those is the sysadmins' job); and they are using COBOL85 only. Now I have two options: rewrite and refactor 50% of the old COBOL system, or use x86 assembly. Should I use x86 assembler or COBOL?

  • PL/SQL exception and Java programs

    - by edwards
    Hi. Business logic is coded in PL/SQL packages, procedures and functions, and Java programs call them to do the database work. The PL/SQL code stores exceptions in Oracle tables whenever an exception is raised. How would my Java programs get at the exceptions, since instead of being propagated from PL/SQL to Java the exception is persisted to an Oracle table and the procedures/functions just return 1 or 0? Sorry folks, I should have added this constraint much earlier and avoided the confusion: as with many legacy projects, we don't have the freedom to modify the stored procedures.
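
    A sketch of how the Java side could surface those logged errors, assuming nothing about the real schema: call the procedure, check its 1/0 status flag, and on failure query the error table that the PL/SQL code writes to. The procedure name, flag convention and error-table columns are all made up:

        import java.sql.*;

        public class LegacyProcCaller {
            public static void runProc(Connection conn, int inputId) throws SQLException {
                // {? = call ...} is the JDBC escape for a function return value.
                CallableStatement cs = conn.prepareCall("{? = call pkg_orders.process(?)}");
                cs.registerOutParameter(1, Types.INTEGER);
                cs.setInt(2, inputId);
                cs.execute();
                int status = cs.getInt(1);
                cs.close();

                if (status == 0) {   // assuming 0 means "failed, see the error table"
                    PreparedStatement ps = conn.prepareStatement(
                        "SELECT error_code, error_message FROM error_log " +
                        "WHERE ref_id = ? ORDER BY logged_at DESC");
                    ps.setInt(1, inputId);
                    ResultSet rs = ps.executeQuery();
                    String detail = null;
                    if (rs.next()) {
                        detail = rs.getInt(1) + ": " + rs.getString(2);
                    }
                    rs.close();
                    ps.close();
                    if (detail != null) {
                        throw new SQLException("PL/SQL reported failure " + detail);
                    }
                }
            }
        }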

  • Usual Hibernate performance pitfalls

    - by Antoine Claval
    Hi, we have just finished profiling our application (it is beginning to get slow). The problem seems to be in Hibernate. It's a legacy mapping which works and does its job, and the relational schema behind it is OK too, but some requests are slow as hell. So we would appreciate any input on the common mistakes made with Hibernate that end up causing slow responses. Example: eager fetching in place of lazy fetching can dramatically change the response time...
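
    As an illustration of the single most common culprit, the N+1 select problem: with eager loading (or lazy loading touched once per row), listing N parents issues one extra SELECT per parent, whereas fetching the association in the query brings everything back in one round trip. Order and items here are hypothetical mapped entities, not anything from the question:

        import org.hibernate.Session;
        import java.util.List;

        public class OrderQueries {
            // N+1 version: one query for the orders, then one more query per
            // order the first time its items collection is touched.
            public static List slow(Session session) {
                return session.createQuery("from Order o").list();
            }

            // Single-query version: the collection is fetched in the same SELECT.
            public static List fast(Session session) {
                return session.createQuery(
                        "select distinct o from Order o left join fetch o.items").list();
            }
        }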

  • C# StandardInput Sending Modifiers

    - by Paul Oakham
    Hi all, we have some legacy software which depends on sending keystrokes to a DOS window and then scraping the screen. I am trying to re-create the software by redirecting the input and output streams of the process directly to my application. This part I have managed fine using:

        _Process = new Process();
        _Process.StartInfo.FileName = APPLICATION;
        _Process.StartInfo.RedirectStandardOutput = true;
        _Process.StartInfo.RedirectStandardInput = true;
        _Process.StartInfo.RedirectStandardError = true;
        _Process.StartInfo.UseShellExecute = false;
        _Process.StartInfo.CreateNoWindow = true;
        _Process.OutputDataReceived += new DataReceivedEventHandler(_Process_OutputDataReceived);
        _Process.ErrorDataReceived += new DataReceivedEventHandler(_Process_ErrorDataReceived);

    My problem is that I need to send some command modifiers such as Ctrl, Alt and Space, as well as F1-F12, to this process, but I am unsure how. I can send basic text and receive responses fine; I just need to emulate these modifiers. Any help would be great. Cheers

  • How is an SOA architecture really supposed to be implemented?

    - by smaye81
    My project is converting a legacy fat-client desktop application into a web application. The database is not changing as a result. Consequently, we are being forced to call external web services to access data in our own database. Couple this with the fact that some parts of our application are allowed to access the database directly through DAOs (a practice that is much faster and easier). The functionality we're supposed to call web services for is what has been deemed necessary for downstream, dependent systems. Is this really how SOA is supposed to work? Admittedly, this is my first foray into the SOA world, but I have to think this is completely the wrong way to go about it.

  • Is there a good collection library for the C language?

    - by matti
    We have to maintain and even develop the C code of our legacy system. Is there a good collection library that would support Java/C#-style collections: Hashtable, HashSet, etc.? Of course without objects, but with structs. Limiting hash table keys to strings and ints is not a problem. It would be nice if it were free even for commercial use. I'm back to C from C# and I must say I'm depressed using our own libraries and the language in general. We're using VS2005 and the MS C compiler, if that matters. Thanks & BR, Matti
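
    By way of illustration, a minimal sketch using GLib's GHashTable, one of the better-known C collection libraries (LGPL, so check whether that licence works for your product); the Employee struct and field names are invented, and building GLib with MSVC is an extra step this sketch does not cover:

        #include <glib.h>
        #include <stdio.h>

        typedef struct {
            int  id;
            char name[64];
        } Employee;

        int main(void)
        {
            GHashTable *table;
            Employee e1 = { 1, "Matti" };
            Employee e2 = { 2, "Jon" };
            Employee *found;

            /* String keys, struct-pointer values: no objects required. */
            table = g_hash_table_new(g_str_hash, g_str_equal);
            g_hash_table_insert(table, "matti", &e1);
            g_hash_table_insert(table, "jon", &e2);

            found = (Employee *) g_hash_table_lookup(table, "matti");
            if (found != NULL)
                printf("%d: %s\n", found->id, found->name);

            g_hash_table_destroy(table);
            return 0;
        }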

  • Can I automatically overwrite repository files using svn_load_dirs.pl or similar?

    - by Andy Strang
    I am working with a legacy VSS repository which was transferred over to a new SVN repository a few months ago. In the meantime, before we go live with the SVN repository, we need to bring over all the changes that have happened in the VSS one between then and now. I was looking at different ways to do this, which seem to be things such as:

    1.) svn_load_dirs.pl, then merge the files manually?
    2.) svn import straight into the trunk and merge the files manually
    3.) check out a working copy of my SVN repository, copy in the changed files (which will overwrite some of the ones in my working copy), then commit the changes

    My question is: can any of these options (or any other) be used to automate things so that I don't have to merge the files, and can instead just overwrite them? I think only option 3 would do this, but any help is appreciated.
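
    A rough sketch of option 3 as a script, with a made-up repository URL and export path; svn add --force picks up files that are new on disk, though files deleted on the VSS side would still need an svn delete by hand:

        svn checkout http://svn.example.com/repo/trunk wc
        cp -R /tmp/vss_export/* wc/
        cd wc
        svn add --force .        # schedules anything that exists on disk but not in SVN
        svn status               # review: M = overwritten, A = newly added
        svn commit -m "Bring trunk up to date with the latest VSS changes"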

  • Are there any context-sensitive code search tools?

    - by Vicky
    I have been getting very frustrated recently in dealing with a massive bulk of legacy code which I am trying to get familiar with. Say I try to search for a particular function call, I get loads of results that turn out to be completely irrelevant; some of them are easy to spot, eg a comment saying // Fixed functionality in foo() so don't need to handle this here any more But others are much harder to spot manually, because they turn out to be calls from other functions in modules that are only compiled in certain cases, or are part of a much larger block of code that is #if 0'd out in its entirety. What I'd like would be a search tool that would allow me to search for a term and give me the choice to include or exclude commented out or #if 0'd out code. Then the search results would be displayed alongside a list of #defines that are required in order for that snippet of code to be relevant. I'm working in C / C++, but other than the specific comment syntax I guess the techniques should be more generally applicable. Does such a tool exist?

  • Rails ActiveRecord - How to set association save order

    - by Altonymous
    I have a weird relationship that needs to be maintained for legacy processes, and I'm trying to figure out how to create the relationship given the new model association.

    New relationship setup:
        Machine has_many MachineReadings has_many Disks has_many DiskReadings

    Old relationship setup:
        Machine has_many MachineReadings has_many DiskReadings has_many Disks

    The problem is that data will come in on the Machine model as nested attributes using the new relationship setup. I need to update the machine_reading_id in the DiskReading model so the old association can continue to be used. I tried doing this via an after_save hook that would traverse back up to the machine and then down to the readings to get the machine_reading.id so I could populate the DiskReading model. However, the associations aren't being saved in the order I would expect: the Disks and DiskReadings are saved before the MachineReadings. So when I go after the machine_reading.id it hasn't been written yet, and thus I am unable to get access to it. For example:

        # machine_disk_reading.rb
        after_save :build_old_relationship

        def build_old_relationship
          self.machine_reading_id = self.disk.machine.readings.find_by_date_time(self.date_time).id
        end

  • What's the best way to write to more files than the kernel allows open at a time?

    - by Elpezmuerto
    I have a very large binary file and I need to create separate files based on the id within the input file. There are 146 output files and I am using cstdlib and fopen and fwrite. FOPEN_MAX is 20, so I can't keep all 146 output files open at the same time. I also want to minimize the number of times I open and close an output file. How can I write to the output files effectively? I also must use the cstdlib library due to legacy code.
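
    One sketch of a workaround under those constraints (C standard library only, at most one output file open at a time): buffer the records per id in memory and flush each buffer with a short-lived fopen in append mode. The buffer size, file naming and error handling here are arbitrary, and a single record is assumed to fit in one buffer:

        #include <stdio.h>
        #include <stdlib.h>
        #include <string.h>

        #define NUM_OUTPUTS 146
        #define BUF_BYTES   (64 * 1024)   /* flush threshold per output file */

        static char   buffers[NUM_OUTPUTS][BUF_BYTES];
        static size_t used[NUM_OUTPUTS];

        /* Write the buffered bytes for this id to its file, then reset the buffer.
         * Only one FILE* is ever open at a time, so FOPEN_MAX is never exceeded. */
        static void flush_output(int id)
        {
            char name[64];
            FILE *fp;

            if (used[id] == 0)
                return;
            sprintf(name, "output_%03d.bin", id);
            fp = fopen(name, "ab");        /* append preserves earlier flushes */
            if (fp == NULL) { perror(name); exit(EXIT_FAILURE); }
            fwrite(buffers[id], 1, used[id], fp);
            fclose(fp);
            used[id] = 0;
        }

        /* Append one record to the in-memory buffer for this id, flushing first
         * if it would overflow. Call flush_output() for every id at the end. */
        static void write_record(int id, const void *data, size_t len)
        {
            if (used[id] + len > BUF_BYTES)
                flush_output(id);
            memcpy(buffers[id] + used[id], data, len);
            used[id] += len;
        }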

  • Help with porting thread functionality: Win32 --> .Net

    - by JimDaniel
    Hi, I am responsible for porting a class from legacy Win32 code to .NET, and I have come across a threading model that I'm not sure how best to implement in .NET. Basically, the Win32 code has one worker thread which calls WaitForMultipleObjects() and executes a particular piece of code when a particular object has been triggered. This has a sort of first-come-first-served effect that I need to emulate in my own code, but I'm not sure how best to do this in .NET. Does anyone have any idea? I see that there is no direct equivalent of WaitForMultipleObjects() in .NET, only the ThreadPool class, which seems to provide most of what I need, but I'm not sure it's the best fit, since I only have four objects in total to wait on and execute code for. Thanks, Daniel
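
    For comparison, a minimal sketch of a similar first-come-first-served loop using WaitHandle.WaitAny, which blocks until any handle in the array is signalled and returns its index; the four events and the work dispatched per index are placeholders:

        using System;
        using System.Threading;

        class Worker
        {
            // One event per "object" the legacy code waited on (placeholders).
            private readonly AutoResetEvent[] _signals =
            {
                new AutoResetEvent(false),
                new AutoResetEvent(false),
                new AutoResetEvent(false),
                new AutoResetEvent(false)
            };

            public void Signal(int index)
            {
                _signals[index].Set();
            }

            public void Run()
            {
                while (true)
                {
                    // Blocks until one of the handles is signalled, reports which one.
                    int which = WaitHandle.WaitAny(_signals);
                    switch (which)
                    {
                        case 0: /* first kind of work  */ break;
                        case 1: /* second kind of work */ break;
                        case 2: /* third kind of work  */ break;
                        case 3: /* fourth kind of work */ break;
                    }
                }
            }
        }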

  • ActionScript 3 Context Menu Per Sprite?

    - by TheDarkIn1978
    Is it not possible to have different context menus for different sprites on the stage? I've tried adding a custom context menu to a sprite, but it's applied to the entire stage:

        mySprite.contextMenu = myMenu;

    Then, after reading the documentation, where it states: "You can attach a ContextMenu object to a specific button, movie clip, or text field object, or to an entire movie level. You use the menu property of the Button, MovieClip, or TextField class to do this." OK, so I thought I had to write it like:

        mySprite.menu.contextMenu = myMenu;

    only to be greeted with a nice migration issue stating that menu is legacy code and to use contextMenu instead. ??? Um, thanks for the heads-up, documentation. This process would be entirely easier if I could extend ContextMenu, but for some reason it's marked "final" and can't be extended... I'm sure Adobe's reasons for finalizing the ContextMenu class are as good as their reasons for including misleading documentation. Thoughts?

  • How to look up an NHibernate entity's table mapping from the type of the entity?

    - by snicker
    Once I've mapped my domain in NHibernate, how can I reverse-lookup those mappings somewhere else in my code? Example: the entity Pony is mapped to a table named "AAZF1203" for some reason. (Stupid legacy database table names!) I want to find out that table name from the NH mappings, using only typeof(Pony), because I have to write a query elsewhere. How can I make the following test pass?

        private const string LegacyPonyTableName = "AAZF1203";

        [Test]
        public void MakeSureThatThePonyEntityIsMappedToCorrectTable()
        {
            string ponyTable = GetNHibernateTableMappingFor(typeof(Pony));
            Assert.AreEqual(LegacyPonyTableName, ponyTable);
        }

    In other words, what does GetNHibernateTableMappingFor(Type t) need to look like?
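
    One possible shape for that helper, assuming the NHibernate Configuration object used to build the session factory is still reachable (the cfg field here is a stand-in) and not guaranteed across every NHibernate version: Configuration.GetClassMapping returns the PersistentClass metadata for the entity, and its Table carries the mapped name.

        using System;
        using NHibernate.Cfg;

        public class MappingLookup
        {
            private readonly Configuration cfg;   // the same Configuration passed to BuildSessionFactory()

            public MappingLookup(Configuration cfg)
            {
                this.cfg = cfg;
            }

            public string GetNHibernateTableMappingFor(Type t)
            {
                // GetClassMapping returns the PersistentClass for the entity type;
                // its Table property holds the mapped table metadata.
                return cfg.GetClassMapping(t).Table.Name;
            }
        }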

  • How do I style jQuery Combobox to look like a normal select dropdown

    - by Joe T
    Hi everyone. I am making use of jQuery 1.4.4 and jQuery UI 1.8.7 in a legacy code base. I have added a few combobox() UI widgets, and these comboboxes live alongside normal <select> dropdowns. I am finding it difficult to get the two elements to look the same, i.e. the unstyled <select> with its Internet Explorer / Windows based style and the most basic jQuery UI themed ui-widget combobox. The screenshot shows an unstyled <select> on the left and a jQuery ui-widget combobox on the right. Is it possible to make the two look the same?

  • How to represent a Many-To-Many relationship in XML or other simple file format?

    - by CSharperWithJava
    I have a list management application that stores its data in a many-to-many relationship database, i.e. a note can be in any number of lists, and a list can have any number of notes. I can also export this data to an XML file and import it in another instance of my app for sharing lists between users. However, this is based on a legacy system where the list-to-note relationship was one-to-many (ideal for XML). Now a note that is in multiple lists is essentially split into two identical rows in the DB, and all relation between them is lost. Question: how can I represent this many-to-many relationship in a simple, standard file format? (Preferably XML, to maintain backwards compatibility.)
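
    One common way to express a many-to-many relationship in XML is to do what a join table does: declare each note once with an id, and have each list refer to notes by id. A minimal sketch with made-up element and attribute names:

        <noteData>
          <notes>
            <note id="n1">Buy milk</note>
            <note id="n2">Call the dentist</note>
          </notes>
          <lists>
            <list id="l1" name="Groceries">
              <noteRef ref="n1"/>
            </list>
            <list id="l2" name="Today">
              <noteRef ref="n1"/>  <!-- the same note appears in two lists -->
              <noteRef ref="n2"/>
            </list>
          </lists>
        </noteData>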

  • CAD/CAM without C++

    - by zaladane
    Hello, is it possible to do CAD/CAM software without having to use C++? My company developed their software with C/C++, but that was more than 10 years ago. Today there is a lot of legacy code that switching would force us to get rid of, but I was wondering what the actual risks are. We have a lot of mathematical algorithms for toolpath calculations, feature recognition, simulation and 3D rendering, and I was wondering if C# can handle all of that without great performance loss. Is it a utopia to rewrite such algorithms in C#, or should that language only deal with the UI? We are not talking about game development here (Halo 3 or Call of Duty), so how much processing does CAD/CAM really need? Can anybody enlighten me on this matter? Most of my colleagues are hardcore C++ programmers, and although I program in C++ I love .NET, but I am having a hard time selling .NET to them for anything other than basic UI. Does it make sense to consider switching to .NET in such a field, or is it just not a wise idea? Thank you
