Search Results

Search found 13318 results on 533 pages for 'tfs to tfs migration tool'.

Page 67/533 | < Previous Page | 63 64 65 66 67 68 69 70 71 72 73 74  | Next Page >

  • Web-based data modelling and management tool

    - by pixeldude
    Is there a web-based tool available where I am able to:
    - define data models (like in a database admin tool)
    - fill in data (in custom web forms, not too generic) with basic features like completion
    - import data from CSV or Excel sheets
    - export data to CSV or SQL
    - create snapshots of my data models (versions, diffs, etc.)
    - share my data models
    - discuss/collaborate with other people about my data models
    Well, I could develop something like this in PHP or with Ruby or whatever. But this is such a common task that application support could be a lot better, and it would be language- and database-independent. It would help to maintain data models in different versions, and you could share your data models with others, extend them with your team members, etc. There is a website called Freebase, which allows you to define a data entity model and fill in data and which also has export features, but I need to define my own data model with my own granularity and structure, and it should not be shared in public if I don't want it to be. How do you solve problems like this yourself?

    Read the article

  • Tool for generating flat files from SQL objects dynamically

    - by Fabio Gouw
    Hello! I'm looking for a tool or component that generates flat files from a SQL Server query result (from a stored procedure or a SELECT * over a table or view). This will be a batch process that runs every day, and every day a new file is created. I could use SQL Server Integration Services (SSIS, formerly DTS), but I have a mandatory requirement: the layout of the file needs to be dynamic. If a new column is added to my query result, the file must include this new column too, without my having to modify the SSIS package. If a column is removed, then the flat file should no longer have it. I've tried to do this with SSIS, but when I create a new package I need to specify the number of columns. Another requirement is configuring the output format depending on the data type of the column. If it's a datetime, the format needs to be YYYY-MM-DD. If it's a float, then I need to use 2 decimal digits, and so on. Does anyone know a tool that does this job? Thanks
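
    A minimal sketch of the kind of dynamic export described above, assuming a pyodbc connection (the connection string, query and file name below are placeholders): the column list is read from cursor.description at run time, so added or removed columns appear in the file without changing the script, and values are formatted by Python type (dates as YYYY-MM-DD, floats with 2 decimal digits).

        import csv
        import datetime
        import pyodbc  # assumption: pyodbc is available on the batch machine

        def export_query_to_csv(conn_str, query, out_path):
            """Dump a query result to CSV, discovering the columns at run time."""
            conn = pyodbc.connect(conn_str)
            cursor = conn.cursor()
            cursor.execute(query)

            # Column names come from the result set itself, so schema changes
            # in the underlying view or procedure are picked up automatically.
            columns = [col[0] for col in cursor.description]

            def fmt(value):
                # Format by data type: dates as YYYY-MM-DD, floats with 2 decimals.
                if isinstance(value, (datetime.datetime, datetime.date)):
                    return value.strftime("%Y-%m-%d")
                if isinstance(value, float):
                    return "%.2f" % value
                return "" if value is None else str(value)

            with open(out_path, "w", newline="") as f:
                writer = csv.writer(f)
                writer.writerow(columns)
                for row in cursor:
                    writer.writerow([fmt(v) for v in row])

            conn.close()

        # Placeholder connection string, query and output file name.
        export_query_to_csv(
            "DRIVER={SQL Server};SERVER=myserver;DATABASE=mydb;Trusted_Connection=yes",
            "SELECT * FROM dbo.MyView",
            "daily_export.csv",
        )

    Scheduling something like this as a daily job (Windows Task Scheduler, SQL Agent, cron) would cover the batch requirement without any package redesign when the schema changes.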

    Read the article

  • G++ Multi-platform memory leak detection tool

    - by indyK1ng
    Does anyone know where I can find a memory leak detection tool for C++ that can be run either from the command line or as an Eclipse plug-in, on Windows and Linux? I would like it to be easy to use, preferably one that doesn't overwrite new(), delete(), malloc() or free(). Something like GDB if it's going to be command line, but I don't remember that being used for detecting memory leaks. If there is a unit testing framework which does this automatically, that would be great. This question is similar to other questions (such as http://stackoverflow.com/questions/283726/memory-leak-detection-under-windows-for-gnu-c-c ), but I feel it is different because those ask for Windows-specific solutions or have solutions which I would rather avoid. I feel I am looking for something a bit more specific here. Suggestions don't have to fulfill all requirements, but as many as possible would be nice. Thanks. EDIT: Since this has come up: by "overwrite" I mean anything which requires me to #include a library or which otherwise changes how my C++ code compiles; if it does its work at run time, so that running the code in a different environment isn't affected, that would be great. Also, unfortunately, I don't have a Mac, so any suggestions for that are unhelpful, but thank you for trying. My desktop runs Windows (I have Linux installed but my dual monitors don't work with it) and I'd rather not run Linux in a VM, although that is certainly an option. My laptop runs Linux, so I can use such a tool on there, although I would definitely prefer sticking to my desktop, as the screen space is excellent for keeping all of the design documentation and requirements in view without having to move too much around on the desktop. NOTE: While I may try answers, I won't mark one as accepted until I have tried the suggestion and found it satisfactory. EDIT 2: I'm not worried about the cross-platform compatibility of my code; it's a command-line application using just the C++ libraries.

    Read the article

  • reporting tool/viewer for large datasets

    - by FrustratedWithFormsDesigner
    I have a data processing system that generates very large reports on the data it processes. By "large" I mean that a "small" execution of this system produces about 30 MB of reporting data when dumped into a CSV file, and a large dataset is about 130-150 MB (I'm sure someone out there has a bigger idea of "large", but that's not the point... ;) ). Excel has the ideal interface for the report consumers in the form of its data lists: users can filter and segment the data on the fly to see the specific details they are interested in, and they can also add notes and markup to the reports, create charts, graphs, etc. They know how to do all this, and it's much easier to let them do it if we just give them the data. Excel was great for the small test datasets, but it cannot handle these large ones. Does anyone know of a tool that can provide a similar interface to Excel data lists, but that can handle much larger files? The next tool I tried was MS Access, and I found that the Access file bloats hugely (a 30 MB input file leads to about a 70 MB Access file, and when I open the file, run a report and close it, the file is at 120-150 MB!), and the import process is slow and very manual (currently, the CSV files are created by the same PL/SQL script that runs the main process, so there's next to no intervention on my part). I also tried an Access database with linked tables pointing to the database tables that store the report data, and that was many times slower (for some reason, SQL*Plus could query and generate the report file in a minute or so, while Access would take anywhere from 2 to 5 minutes for the same data). (If it helps, the data processing system is written in PL/SQL and runs on Oracle 10g.)

    Read the article

  • WordPress permalinks not working, everything seems fine

    - by javipas
    I have a WordPress blog I've migrated from another CMS, and I've been having a lot of problems with my permalink structure: lots of articles give a 404, although they are there, somewhere, published. The site is www.muycomputerpro.com (MCP for short), and for example an article that should be found is: http://muycomputerpro.com/Actualidad/Especiales/2009-las-grandes-crecen-en-la-bolsa If I do a search with the search tool at MCP, the result is there (see EnlacesMCP-1.jpg). But when I click on the link, our 404 error page appears (see EnlacesMCP-2.jpg). The weird thing is, the article is published, and the permalink is the right one, as you can see on this screenshot of the WordPress CMS: the permalink (below the title) is correct (http://www.muycomputerpro.com/Actualidad/Especiales/2009-las-grandes-crecen-en-la-bolsa/) but it does not work. In fact, if I try to use the short link (http://www.muycomputerpro.com/?p=5023) the article does not show either. I've accessed my WordPress DB and I've searched for the article to see if there is something wrong there, but from what I can tell all the fields are OK; here's a screenshot: I really don't know what is causing this. The permalink structure should work (I'm using the "Custom permalink" plugin to preserve the old URLs that had an alphanumeric code at the end of the post name) and the permalink setting in WordPress is "/%postname%/". I really need help :(

    Read the article

  • CSC folder data access AND roaming profiles issues (Vista with Server 2003, then 2008)

    - by Alex Jones
    I'm a junior sysadmin for an IT contractor that helps small, local government agencies, like little towns and the like. One of our clients, a public library with ~50 staff users, was recently migrated from Server 2003 Standard to Server 2008 R2 Standard in a very short timeframe; our senior employee, the only network engineer, had suddenly put in his two weeks' notice, so management pushed him to do this project before quitting. A bit hasty on management's part? Perhaps. Could we do anything about that? Nope. Do I have to fix this all by myself? Pretty much.
    The network is set up like this:
    a) 50ish staff workstations, all running Vista Business SP2. All staff use MS Outlook, which uses RPC-over-HTTPS ("Outlook Anywhere") for cached Exchange access to an offsite location.
    b) One new (virtualized) Server 2008 R2 Standard instance, running atop a Server 2008 R2 host via Hyper-V. The VM is the domain's DC, and also the site's one and only file server. Let's call that VM "NEWBOX".
    c) One old physical Server 2003 Standard server, running the same roles. Let's call it "OLDBOX". It's still on the network and accessible, but it's been demoted, and its shares have been disabled. No data has been deleted.
    d) Gigabit Ethernet everywhere. The organization has only one domain, and it did not change during the migration.
    e) Most users were set up for a combo of redirected folders + offline files, but some older employees who had been with the organization a long time are still on roaming profiles.
    To sum up: the servers in question handle user accounts and files, nothing else (e.g., no TS, no mail, no IIS, etc.). I have two major problems I'm hoping you can help me with:
    1) Even though all domain users have had their redirected folders moved to the new server, and logging in to their workstations and testing confirms that the Documents/Music/Whatever folders point to the new paths, it appears some users (not laptops or anything, either!) had been working offline from OLDBOX for a long time, and nobody realized it. Here's the ugly implication: a bunch of their data now lives only in their CSC folders, because they can't access the share on OLDBOX and finally sync with it. How do I get this data out of those CSC folders, and onto NEWBOX?
    2) What's the best way to migrate roaming profile users to non-roaming ones, without losing vital data like documents, any lingering PSTs, etc.?
    Things I've thought about trying:
    For problem 1:
    a) Re-enable the documents share on OLDBOX, force an Offline Files sync for ALL domain users, then copy OLDBOX's share data to the equivalent share on NEWBOX. Reinitialize the Offline Files cache for every user. With this: How do I safely force a domain-wide Offline Files sync? Could I lose data by re-enabling the share on OLDBOX and forcing the sync? Afterwards, how can I reinitialize the Offline Files cache for every user without doing it manually, workstation by workstation?
    b) Determine which users have unsynced changes to OLDBOX (again, how?), search each user's CSC folder domain-wide via workstation admin shares, and grab the unsynced data. Reinitialize the Offline Files cache for every user. With this: How can I detect which users have unsynced changes with a script? How can I search each user's CSC folder, when the ownership and permissions set for CSC folders are so restrictive? Again, afterwards, how can I reinitialize the Offline Files cache for every user without doing it manually, workstation by workstation?
    c) Manually visit each workstation, copy the contents of the CSC folder, and manually copy that data onto NEWBOX. Reinitialize the Offline Files cache for every user. With this: Again, how do I 'break into' the CSC folder and get to its data? As an experiment, I took one workstation's HD offsite, imaged it for safety, and then tried the following with one of our shop PCs after attaching the drive: grant myself full control of the folder (failed), grant myself ownership of the folder (failed), run chkdsk on the whole drive to make sure nothing's messed up (all OK), try to take full control of the entire drive (failed), try to take ownership of the entire drive (failed). MS KB articles and Googling around suggest there's a utility called CSCCMD that's meant for this exact scenario... but it looks like it's available for XP, not Vista, no? Again, afterwards, how can I reinitialize the Offline Files cache for every user without doing it manually, workstation by workstation?
    For problem 2:
    a) Figure out which users are on roaming profiles, and where their profiles 'live' on the server. Create new folders for them in the redirected folders repository, migrate existing data, and disable the roaming. With this: Finding out who's roaming isn't hard. But what's the best way to disable the roaming itself? In AD Users and Computers, or on each user's workstation? Doing it centrally on the server seems more efficient; that said, all of the KB research I've done turns up articles on how to go from local to roaming, not the other way around, so I don't have good documentation on this.
    In closing: we have good backups of NEWBOX and OLDBOX, but not of the workstations themselves, so anything drastic on the client side would need imaging and testing for safety. Thanks for reading along this far! Hopefully you can help me dig us out of this mess.

    Read the article

  • Team Foundation Server - A programmer's guide

    - by Filip Ekberg
    In addition to my previous topic on how to use SVN (Branch? Tag? Trunk?), I would like to go in depth on how a programmer should or could use TFS. The things that are most interesting to me are not how to set up the server, but rather how you use it on a daily basis. In the area of software engineering, where your responsibility lies not only in code but also in architecture, documentation and other fields, you need to have a collection of your work, preferably in the same place. So these are my points of interest, which I would like to get more knowledge about:
    - How would you structure a TFS workspace / project to support lots of different customers / projects, and maybe different projects per customer?
    - Splitting up the folder structure of the above project into different pieces such as Code, Documents (Architecture, Requirements) and others: what more could there be, and what would be a nice, commonly used folder structure?
    - An easy-to-browse repository: again, the folder structure here is important, but this point is more aimed at different explorers for the repository, not only the built-in Team Foundation Explorer.
    These are just a couple of the points that I would like to know more about. Suggestions on beginners' guides, in-depth guides and links covering the above would be very helpful; please feel free to add other important knowledge points as well.

    Read the article

  • Looking for Recommendations on Version Control System with Intuitive (.NET) API?

    - by John L.
    I'm working on a project which generates (composite) Microsoft Word documents that are comprised of one or more child documents. There are tens of thousands of permutations of the composite documents, far too many for users to easily manage. Users will need to view/edit the child documents through the app, which hides all of the nasty implementation details. A requirement of the system is that the child documents must be version controlled. That is what has been tripping me up. I've been torn between using an off-the-shelf solution or rolling my own. At a minimum, the system needs to support get latest, get specific version, add new, rename and possibly delete. I've whiteboarded it enough to realize it won't be a trivial task to create my own. As far as commercial systems go, I have VSS and TFS at my disposal. I've played with the TFS API some, but it isn't as intuitive or well documented as I had hoped. I'm not averse to an open source solution (e.g. SVN), but I have less familiarity with those. Which approach or tool would you recommend? Why? Do you have any links to API documentation you would recommend? Environment: C#, VS2008, SQL Server 2005/2008, low volume (a few hundred operations per day)

    Read the article

  • How to migrate project from RCS to git? (SOLVED)

    - by Norman Ramsey
    I have a 20-year-old project that I would like to migrate from RCS to git, without losing the history. All web pages suggest that the One True Path is through CVS. But after an hour of Googling and trying different scripts, I have yet to find anything that successfully converts my RCS project tree to CVS. I'm hoping the good people at Stack Overflow will know what actually works, as opposed to what is claimed to work and doesn't. (I searched Stack Overflow using both the native SO search and a Google search, but if there's a helpful answer in the database, I missed it.)
    UPDATE: The rcs-fast-export tool at http://git.oblomov.eu/rcs-fast-export was repaired on 14 April 2009, and this version seems to work for me. This tool converts straight to git with no intermediate CVS. Thanks Giuseppe and Jakub!!!
    Things that did not work that I still remember:
    - The rcs-to-cvs script that ships in the contrib directory of the CVS sources
    - The rcs-fast-export tool at http://git.oblomov.eu/rcs-fast-export in versions before 13 April 2010
    - The rcs2cvs script found in a document called "CVS-RCS-HOW-TO Document for Linux"

    Read the article

  • Using git (or some other VCS) at your company

    - by supercheetah
    Some friends of mine and I were talking recently about version control, and how they were using VSS at their jobs and were probably going to be moving off of that soon. One of them said that his company will likely be going with Team Foundation Server. Eventually, the conversation did get around to some of the open source VCSes out there, including git and SVN. None of us really knew of any companies that use either of these internally, although we imagined that a number of them do so for SVN; we weren't too sure about git. I brought up Google and Android using it, but my friend figured that's only for the public-facing source code, and that they may use something different for internal projects. Apparently it's more than just SCM that makes TFS so intriguing:
    - Microsoft sales people and support (although my friend did point out some things to his managers that he thought might be misleading on MS's part)
    - Integration of things beyond SCM, including project management (I'm just finding out that there are tools geared toward the same things for git)
    - Again, it's Microsoft, and the transition from VSS to TFS seems logical (or does it?)
    I'm not much of a fan of SVN, so I didn't really bring it up much, but I am curious about whether or not git is used at your company for internal projects. Have you thought about it and decided against it? Any reason why?

    Read the article

  • Designers, Expression or SharePoint Designer, and real source control

    - by David Lively
    I'm trying desperately to move from VSS to a real source control system. Options include TFS and SVN. My designers need to keep their ability to modify source files and instantly preview their changes in a browser without having to commit them. Using FPSE with VSS, this works flawlessly, since saving a file causes the copy in the working folder on the dev server to be updated, so they can just save and refresh their browser, which is pointed at the dev server. The site in question consists of 350k+ lines of classic ASP code and some new ASP.NET MVC. The designers only need to be able to modify views within the MVC code, not C#. Though Expression includes a version of Cassini for local debugging, Cassini does not support classic ASP. Surely someone has solved this problem before. It can't be necessary to install IIS on each designer's machine (this is absolutely untenable). I need a way to have a common working folder on a dev web server updated whenever someone saves a file locally, just like with FPSE. I'd rather not write an FPSE proxy that knows how to talk to TFS/SVN. Any suggestions? (I know I've asked this question in the past, but I haven't yet found a solution.)

    Read the article

  • Migrate from Oracle to MySQL

    - by Cassy
    Hi all. We ran into serious performance problems with our Oracle database, and we would like to try to migrate to a MySQL-based database (either MySQL directly or, preferably, Infobright). The thing is, we need to let the old and the new system overlap for at least some weeks, if not months, before we actually know whether all features of the new database match our needs. So here is our situation: The Oracle database consists of multiple tables, each with millions of rows. During the day there are literally thousands of statements, which we cannot stop for the migration. Every morning, new data is imported into the Oracle database, replacing some thousands of rows. Copying this process is not a problem, so we could, in theory, import into both databases in parallel. But, and here lies the challenge, for this to work we need an export from the Oracle database with a consistent state from one day. (We cannot export some tables on Monday and some others on Tuesday, etc.) This means that at least the export should finish in less than one day. Our first thought was to dump the schema, but I wasn't able to find a tool that imports an Oracle dump file into MySQL. Exporting tables to CSV files might work, but I'm afraid it could take too long. So my question now is: What should I do? Is there any tool to import Oracle dump files into MySQL? Does anybody have any experience with such a large-scale migration? Thanks in advance, Cassy PS: Please don't suggest performance optimization techniques for Oracle, we already tried a lot :-)
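
    One way to get a consistent single-day export while the daily statements keep running, sketched below as an assumption rather than a tested recipe (driver, credentials and table names are placeholders): open one read-only transaction, which in Oracle sees a snapshot of the data as of the transaction's start, and dump every table to CSV inside that transaction so all files reflect the same point in time.

        import csv
        import cx_Oracle  # assumption: the cx_Oracle driver is installed

        TABLES = ["ORDERS", "CUSTOMERS", "LINE_ITEMS"]  # placeholder table names

        conn = cx_Oracle.connect(user="scott", password="tiger", dsn="orcl")  # placeholders
        cursor = conn.cursor()

        # Every query after this statement sees the database as it was when the
        # read-only transaction started, even while the daily load continues.
        cursor.execute("SET TRANSACTION READ ONLY")

        for table in TABLES:
            cursor.execute("SELECT * FROM " + table)
            columns = [col[0] for col in cursor.description]
            with open(table.lower() + ".csv", "w", newline="") as f:
                writer = csv.writer(f)
                writer.writerow(columns)
                for row in cursor:
                    writer.writerow(row)

        conn.commit()  # ends the read-only transaction
        conn.close()

    The undo retention would have to cover the whole export window, or the snapshot reads can fail with ORA-01555; a variation on the same idea is flashback queries (SELECT ... AS OF SCN) pinned to a single SCN, which lets several sessions export different tables in parallel against the same snapshot.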

    Read the article

  • Export to Excel in TFS 2012

    - by Martin
    I've installed Visual Studio 2012 RC. I'm working with TFS 2012, and when I want to export a user story to Excel I see the following error: "TF400422: Failed to open in Microsoft Excel: Error loading type library/Dll. (Exception from HRESULT: 0x80029C4A (TYPE_E_CANTLOADLIBRARY))" I'm using Microsoft Office Professional Plus 2010, Version 14.0.6112.5000 (64-bit). This page shows the meaning of many error numbers, but this error (code TF400422) isn't there. Perhaps this relates to this other issue. Thanks!!

    Read the article

  • Html.EditorFor, Html.DisplayFor not working on MVC1.0 -> MVC2.0 manual migration

    - by lawrence-chase
    Has anyone encountered this problem? I manually migrated an MVC 1.0 application to MVC 2.0, and everything so far seems to be working fine. Today I wanted to try out the Html.EditorFor helper, and it doesn't render the template. I set it up the same way in a fresh MVC 2.0 application, and there the template does render. Is there anything else needed (specifically when migrating) to activate this behavior, other than dropping a partial view like DateTime.ascx into Views/Shared/EditorTemplates and using the helper methods to render the model objects? I'm stumped.

    Read the article

  • Problems with builds on TFS 2010 and resolving dependencies

    - by Jimmy Engtröm
    Hi, I have a project that works great on my machine (and on the production servers). It's a VS2010 C# project targeting .NET 3.5. When I let my build server build the solution, it can't resolve a couple of my third-party DLLs. Error message: C:\Windows\Microsoft.NET\Framework64\v4.0.30319\Microsoft.Common.targets(1360,9): warning MSB3268: The primary reference "Third.Party.Assembly, Version=50.11.2.0, Culture=neutral, PublicKeyToken=0561a7c6dbd6f0ea, processorArchitecture=MSIL" could not be resolved because it has an indirect dependency on the framework assembly "Microsoft.VisualBasic.Compatibility, Version=8.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a" which could not be resolved in the currently targeted framework. ".NETFramework,Version=v3.5". To resolve this problem, either remove the reference "Third.Party.Assembly, Version=50.11.2.0, Culture=neutral, PublicKeyToken=0561a7c6dbd6f0ea, processorArchitecture=MSIL" or retarget your application to a framework version which contains "Microsoft.VisualBasic.Compatibility, Version=8.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a". [d:\Builds\3\mySolution.sln] Everything compiles and runs great on my machine, but the build server seems to struggle. I think the Third.Party.Assembly is written in VB.NET. Since the assembly is third party, I can't remove the reference to "Microsoft.VisualBasic.Compatibility", and since I don't get any warnings on my computer, could it really be that I'm targeting v3.5? Any suggestions? /Jimmy

    Read the article

  • TFS Build errors TF224003, TF215085, TF215076

    - by iamdudley
    Hi, I am using TFS 2008 and VS2008. I run nightly builds for about 20 applications using one build agent, and the builds are scheduled for either 1am or 2am. Most of the builds succeed; however, six of them fail regularly with similar errors. The errors are either the first two below, or the third one by itself:
        TF215085: An error occurred while connecting to agent \xxxx\BUILDMACHINE: TF215076: Team Foundation Build on computer BUILDMACHINE (port 9191) is not responding. (Detail Message: The request was aborted: The operation has timed out.)
        11/04/2010 2:10:10 AM TF224003: An exception occurred on the build computer BUILDMACHINE: The build (vstfs:///Build/Build/2632) has already completed and cannot be started again.
        TF215085: An error occurred while connecting to agent \yyyyy\BA_WKSTFSBUILD: Team Foundation services are not available from server srvtfs. Technical information (for administrator): The operation has timed out
    It looks to me like some kind of communication error; maybe the port gets overloaded - can this happen? Should I spread the builds out a bit more? In the build definition it says "Queue the build on the default build agent at", so I figured that if I scheduled them to start at the same time they would be queued and run sequentially. Most of the suggestions I've found online for these errors are for all-or-nothing scenarios where no builds work at all, whereas in my case most builds work but some consistently fail. Judging by the dates of the last successful builds of the six failing builds, I believe it is the same six failing every night. (I'm editing the build definitions now to keep the failed builds so I can get more info on the problem.) Any help on this would be much appreciated. James.

    Read the article

  • msdeploy IIS 6 to 7 migration issue

    - by rboorgapally
    I am trying to view the dependencies of my website on IIS 6.0 running on Windows Server 2003. When I type the following command:
        msdeploy -verb:getDependencies -source:metakey=lm/w3svc/1
    I get the following errors:
        C:\Program Files\IIS\Microsoft Web Deploy>msdeploy -verb:getDependencies -source:metakey=lm/w3svc/1
        Error: Object of type 'metaKey' and path 'lm/w3svc/1' cannot be created
        Error: The metabase key '/lm/w3svc/1' could not be found.
        Error: Access is denied. (Exception from HRESULT: 0x80070005 (E_ACCESSDENIED))
        Error count: 1
    Can anyone explain these errors to me?

    Read the article

  • iPhone - Core Data crash on migration

    - by Sergey Zenchenko
    I have a problem: when I install the app from Xcode everything works, but if I build the app and install it from iTunes I get a database error at launch. This happens only when I have changes in the Core Data model and need to migrate to a new version. At first launch it crashes with this backtrace:
        Thread 0:
        0   libSystem.B.dylib   0x00034588 pwrite + 20
        1   libsqlite3.dylib    0x000505ec _sqlite3_purgeEligiblePagerCacheMemory + 2808
        2   libsqlite3.dylib    0x000243d8 sqlite3_backup_init + 7712
        3   libsqlite3.dylib    0x000244ac sqlite3_backup_init + 7924
        4   libsqlite3.dylib    0x0000d418 sqlite3_file_control + 4028
        5   libsqlite3.dylib    0x000228b4 sqlite3_backup_init + 764
        6   libsqlite3.dylib    0x00022dd0 sqlite3_backup_init + 2072
        7   libsqlite3.dylib    0x000249a8 sqlite3_backup_init + 9200
        8   libsqlite3.dylib    0x00029800 sqlite3_open16 + 11360
        9   libsqlite3.dylib    0x0002a200 sqlite3_open16 + 13920
        10  libsqlite3.dylib    0x0002ab84 sqlite3_open16 + 16356
        11  libsqlite3.dylib    0x00049418 sqlite3_prepare16 + 54056
        12  libsqlite3.dylib    0x00002940 sqlite3_step + 44
        13  CoreData            0x00011958 _execute + 44
        14  CoreData            0x000113e0 -[NSSQLiteConnection execute] + 696
        15  CoreData            0x000994be -[NSSQLConnection prepareAndExecuteSQLStatement:] + 26
        16  CoreData            0x000be14c -[_NSSQLiteStoreMigrator performMigration:] + 244
        17  CoreData            0x000b6c60 -[NSSQLiteInPlaceMigrationManager migrateStoreFromURL:type:options:withMappingModel:toDestinationURL:destinationType:destinationOptions:error:] + 1040
        18  CoreData            0x000aceb0 -[NSStoreMigrationPolicy(InternalMethods) migrateStoreAtURL:toURL:storeType:options:withManager:error:] + 92
        19  CoreData            0x000ad6f0 -[NSStoreMigrationPolicy migrateStoreAtURL:withManager:metadata:options:error:] + 72
        20  CoreData            0x000ac9ee -[NSStoreMigrationPolicy(InternalMethods) _gatherDataAndPerformMigration:] + 880
        21  CoreData            0x0000965c -[NSPersistentStoreCoordinator addPersistentStoreWithType:configuration:URL:options:error:] + 1328
    On subsequent launches the app doesn't load data from the database.

    Read the article

  • What version control tool can generate this header

    - by Stan
    This is from a SQL script. What tool can generate this? Thanks
        --USE [MY_TABLE]
        GO
        /****** Object:  StoredProcedure [dbo].[adminIncExp]    Script Date: 03/05/2010 09:14:12 ******/
        SET ANSI_NULLS ON
        GO
        SET QUOTED_IDENTIFIER ON
        GO
        -- =============================================
        -- Author:      <Author,,Name>
        -- Create date: <Create Date,,>
        -- Description: <Description,,>
        -- =============================================

    Read the article

  • Best tool for developing CSS positioning code

    - by alchemical
    What's the best way to develop CSS, in particular when it will play a role in the page layout and positioning? Is this best done by hand or with a particular editor? If by hand, how would you know the values needed to specify the positioning? If a tool is best, can VS2008/2010 do the job, or are there better alternatives? By the way, this is for an ASP.NET-based web site.

    Read the article

  • HTML Doc Tool for Delphi 2009

    - by Smasher
    Is there any free tool for creating HTML documentation that fully understands Delphi 2009 features like generics and anonymous methods? I tried DelphiCodeToDoc, but it crashes while parsing the source code.

    Read the article

  • Any Flex 4 migration experience?

    - by Gok Demir
    My current development stack is MySQL + iBatis + Spring + Spring BlazeDS Integration 1.01 + BlazeDS 3.2, and Flex 3 with the Mate 0.8.9 framework. Now Flash Builder 4 beta 2 is out, with cool features like Data-Centric Development (DCD), form generation, etc. Do you know how Spring BlazeDS Integration works with BlazeDS 4? What about Mate? Are there any issues with Flex 4? How does DCD fit with Mate event maps? I know it is better to try it out myself, but I just want to check whether somebody has already tried migrating to Flex 4. If so, what are the issues? Did you notice any productivity speed-up? Thanks.

    Read the article

  • sqlalchemy date type in 0.6 migration using mssql

    - by nosklo
    I'm connecting to an MSSQL server through pyodbc, via the FreeTDS ODBC driver, on Ubuntu Linux 10.04. SQLAlchemy 0.5 uses DATETIME for sqlalchemy.Date() columns. SQLAlchemy 0.6 now uses DATE, but SQL Server 2000 doesn't have a DATE type. How can I make DATETIME the default for sqlalchemy.Date() on the SQLAlchemy 0.6 mssql+pyodbc dialect? I'd like to keep it as clean as possible. Here's code to reproduce the issue:
        import sqlalchemy
        from sqlalchemy import Table, Column, MetaData, Date, Integer, create_engine

        engine = create_engine(
            'mssql+pyodbc://sa:sa@myserver/mydb?driver=FreeTDS')
        m = MetaData(bind=engine)
        tb = sqlalchemy.Table('test_date', m,
            Column('id', Integer, primary_key=True),
            Column('dt', Date())
        )
        tb.create()
    And here is the traceback I'm getting:
        Traceback (most recent call last):
          File "/tmp/teste.py", line 15, in <module>
            tb.create()
          File "/home/nosklo/.local/lib/python2.6/site-packages/sqlalchemy/schema.py", line 428, in create
            bind.create(self, checkfirst=checkfirst)
          File "/home/nosklo/.local/lib/python2.6/site-packages/sqlalchemy/engine/base.py", line 1647, in create
            connection=connection, **kwargs)
          File "/home/nosklo/.local/lib/python2.6/site-packages/sqlalchemy/engine/base.py", line 1682, in _run_visitor
            **kwargs).traverse_single(element)
          File "/home/nosklo/.local/lib/python2.6/site-packages/sqlalchemy/sql/visitors.py", line 77, in traverse_single
            return meth(obj, **kw)
          File "/home/nosklo/.local/lib/python2.6/site-packages/sqlalchemy/engine/ddl.py", line 58, in visit_table
            self.connection.execute(schema.CreateTable(table))
          File "/home/nosklo/.local/lib/python2.6/site-packages/sqlalchemy/engine/base.py", line 1157, in execute
            params)
          File "/home/nosklo/.local/lib/python2.6/site-packages/sqlalchemy/engine/base.py", line 1210, in _execute_ddl
            return self.__execute_context(context)
          File "/home/nosklo/.local/lib/python2.6/site-packages/sqlalchemy/engine/base.py", line 1268, in __execute_context
            context.parameters[0], context=context)
          File "/home/nosklo/.local/lib/python2.6/site-packages/sqlalchemy/engine/base.py", line 1367, in _cursor_execute
            context)
          File "/home/nosklo/.local/lib/python2.6/site-packages/sqlalchemy/engine/base.py", line 1360, in _cursor_execute
            context)
          File "/home/nosklo/.local/lib/python2.6/site-packages/sqlalchemy/engine/default.py", line 277, in do_execute
            cursor.execute(statement, parameters)
        sqlalchemy.exc.ProgrammingError: (ProgrammingError) ('42000', '[42000] [FreeTDS][SQL Server]Column or parameter #2: Cannot find data type DATE. (2715) (SQLExecDirectW)') '\nCREATE TABLE test_date (\n\tid INTEGER NOT NULL IDENTITY(1,1), \n\tdt DATE NULL, \n\tPRIMARY KEY (id)\n)\n\n' ()
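
    A minimal sketch of one possible workaround, assuming SQLAlchemy 0.6's TypeDecorator machinery (the class name below is made up): wrapping DateTime in a TypeDecorator makes the DDL come out as DATETIME on SQL Server 2000, while Python code still gets date objects back.

        import datetime
        from sqlalchemy import types

        class DateAsDateTime(types.TypeDecorator):
            """Date column stored as DATETIME, for SQL Server 2000 via FreeTDS."""
            impl = types.DateTime  # CREATE TABLE is generated from the impl type

            def process_bind_param(self, value, dialect):
                # date objects can go to the driver as-is
                return value

            def process_result_value(self, value, dialect):
                # strip the time part that DATETIME adds on the way back
                if isinstance(value, datetime.datetime):
                    return value.date()
                return value

    Declaring the column as Column('dt', DateAsDateTime()) instead of Column('dt', Date()) would then emit DATETIME in the generated CREATE TABLE statement.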

    Read the article

  • Django data migration when changing a field to ManyToMany

    - by Ken H
    I have a Django application in which I want to change a field from a ForeignKey to a ManyToManyField. I want to preserve my old data. What is the simplest/best process to follow for this? If it matters, I use sqlite3 as my database back-end. If my summary of the problem isn't clear, here is an example. Say I have two models:
        class Author(models.Model):
            author = models.CharField(max_length=100)

        class Book(models.Model):
            author = models.ForeignKey(Author)
            title = models.CharField(max_length=100)
    Say I have a lot of data in my database. Now, I want to change the Book model as follows:
        class Book(models.Model):
            author = models.ManyToManyField(Author)
            title = models.CharField(max_length=100)
    I don't want to "lose" all my prior data. What is the best/simplest way to accomplish this? Ken
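
    One commonly suggested pattern, sketched here as an assumption rather than a tested recipe (the app path myapp and the temporary field name authors are made up): keep the old ForeignKey while adding the ManyToManyField under a new name, copy the existing relation once, and only then drop the ForeignKey and rename the new field in a follow-up schema change.

        # One-off data copy, run for example from "python manage.py shell",
        # while Book temporarily has BOTH fields:
        #   author = models.ForeignKey(Author)          # old
        #   authors = models.ManyToManyField(Author)    # new
        from myapp.models import Book  # hypothetical app path

        for book in Book.objects.all():
            # the single old author becomes the only entry in the new relation
            book.authors.add(book.author)

    The join table for the new field has to exist before the copy runs; with old-style Django the SQL for it can be taken from "python manage.py sqlall myapp" if syncdb does not pick it up, since syncdb never alters existing tables.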

    Read the article
