Search Results

Search found 12390 results on 496 pages for 'zend tool'.

Page 40/496 | < Previous Page | 36 37 38 39 40 41 42 43 44 45 46 47  | Next Page >

  • Right tool for the job (HTML5 vs Silverlight)

    - by Jargo
    I have been tasked to develop a web-based offer request application for bathrooms, and I am asking for opinions on which tool I should choose. I have narrowed the options down to two: HTML5 and Silverlight. Most of the application consists of basic UI elements that could be achieved with HTML4. However, the most challenging part is the design view for the bathroom. The user first defines the size of the bathroom by giving its dimensions. When the dimensions are defined, the application renders an empty bathroom floorplan and the user can start dragging and dropping furniture onto it. Some furniture items have rules about where they can be dragged; for example, sinks can only be dragged next to a wall. I know that this functionality can be achieved with Silverlight, but what about HTML5? I know that the canvas element can be used to draw graphics, but can it render the bathroom smoothly as the user drags a piece of furniture across it?
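
    For what it's worth, the dragging itself is quite tractable with canvas: redraw the floorplan on each mouse move and clamp the dragged item's position according to its placement rule. Below is a minimal sketch of that idea; the element id, the furniture data and the snap-to-wall rule are made up for illustration, not taken from the question.

        // Minimal canvas drag sketch (TypeScript); names and the wall-snapping rule are illustrative only.
        interface Furniture { x: number; y: number; w: number; h: number; mustTouchWall: boolean; }

        const canvas = document.getElementById("floorplan") as HTMLCanvasElement;
        const ctx = canvas.getContext("2d")!;
        const room = { w: canvas.width, h: canvas.height };
        const items: Furniture[] = [{ x: 40, y: 40, w: 60, h: 40, mustTouchWall: true }];
        let dragged: Furniture | null = null;

        function draw() {
          ctx.clearRect(0, 0, room.w, room.h);   // empty floorplan
          ctx.strokeRect(0, 0, room.w, room.h);  // walls
          for (const f of items) ctx.fillRect(f.x, f.y, f.w, f.h);
        }

        canvas.addEventListener("mousedown", e => {
          dragged = items.find(f =>
            e.offsetX >= f.x && e.offsetX <= f.x + f.w &&
            e.offsetY >= f.y && e.offsetY <= f.y + f.h) ?? null;
        });
        canvas.addEventListener("mousemove", e => {
          if (!dragged) return;
          dragged.x = e.offsetX - dragged.w / 2;
          dragged.y = e.offsetY - dragged.h / 2;
          if (dragged.mustTouchWall) {
            // Placement rule: snap items like sinks to the nearest wall.
            const dl = dragged.x, dr = room.w - (dragged.x + dragged.w);
            const dt = dragged.y, db = room.h - (dragged.y + dragged.h);
            const min = Math.min(dl, dr, dt, db);
            if (min === dl) dragged.x = 0;
            else if (min === dr) dragged.x = room.w - dragged.w;
            else if (min === dt) dragged.y = 0;
            else dragged.y = room.h - dragged.h;
          }
          requestAnimationFrame(draw);
        });
        canvas.addEventListener("mouseup", () => { dragged = null; });
        draw();

    Clearing and redrawing a handful of rectangles per mouse move is well within what canvas handles smoothly, even on modest hardware, so the drag interaction itself should not be the bottleneck.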

    Read the article

  • 3D Paint tool in Maya

    - by Joris
    Hey everyone, could someone help me out with this question? When I am trying to texture objects in Maya (for example a barrel), I want to texture them with a brush. When I try to use the 3D Paint Tool, everything turns black even though I have an image file selected. Also, the resolution of the models and textures in Maya is very, very low, even though the texture image is very high quality. Can someone please help me? Thanks so much for helping.

    Read the article

  • Software, script or a tool to automate managing which tests to run

    - by laggingreflex
    I have a batch file that lists all the test files I have and asks me which test I want to perform, like this: Test. [U]nit, [I]ntegration : i (user input) Integration. [A]ll, [2][U]serInteraction, [3][R]esultGeneration : u 2 User Interaction. Running "mocha integration\2userint.js" ... So essentially I have configured a batch "option" for each test file I have, which I can choose to run individually or all together. But adding and removing tests is a pain: I have to update the batch file every time a file is added or changed. Is there a piece of software, a script or a tool that does this automatically, or makes it easier for me to do so? I basically need it to be aware of my test files and ask me which file(s) I want to test. A GUI with checkboxes would be ultimate! But I'll take anything. I'm working in Node.js.
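
    One low-maintenance alternative to hand-editing the batch file is a small Node script that discovers the test files itself and then asks which one to run. A rough sketch follows; the "integration" folder name comes from the example above, but the prompt text and layout are otherwise assumptions.

        // test-picker sketch: list the test files found on disk, ask which to run, then invoke mocha.
        import { readdirSync } from "fs";
        import { join } from "path";
        import { spawnSync } from "child_process";
        import { createInterface } from "readline";

        const root = "integration";                              // assumed test folder
        const files = readdirSync(root).filter(f => f.endsWith(".js"));

        files.forEach((f, i) => console.log(`[${i}] ${f}`));
        const rl = createInterface({ input: process.stdin, output: process.stdout });
        rl.question("Which test (number, or 'a' for all)? ", answer => {
          rl.close();
          const chosen = answer.trim().toLowerCase() === "a"
            ? files
            : files.filter((_, i) => i === Number(answer));
          for (const f of chosen) {
            spawnSync("npx", ["mocha", join(root, f)], { stdio: "inherit", shell: true });
          }
        });

    Because the list is read from disk on every run, adding or removing a test file needs no further bookkeeping.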

    Read the article

  • VMMap - awesome memory analysis tool

    VMMap is a process virtual and physical memory analysis utility. It shows a breakdown of a process's committed virtual memory types as well as the amount of physical memory (working set) assigned by the operating system to those types. Besides graphical representations of memory usage, VMMap also shows summary information and a detailed process memory map. Powerful filtering and refresh capabilities allow you to identify the sources of process memory usage and the memory cost of application features. Besides flexible views for analyzing live processes, VMMap supports the export of data in multiple forms, including a native format that preserves all the information so that you can load it back in later. It also includes command-line options that enable scripting scenarios. VMMap is the ideal tool for developers wanting to understand and optimize their application's memory resource usage.

    Read the article

  • Can not Load Type - Starting the ASP.NET Web Site Admin Tool

    A beginner emailed me last night, and I had forgotten about this little ditty! When you create a NEW project and immediately try to run the Web Site Administration Tool, you will get this error. The solution is easy: BUILD FIRST! I remember being very confused the first time I got this :)

    Read the article

  • Best tool to understand source

    - by cache
    I have the source code for a project. I am working on porting it to another device, as the current source code is for a Linux environment. I am getting some errors in the newly ported code, so I was thinking it would be best to once again understand the whole source code; this would help me localise the errors. Now the problem is that I tried using gdb on Linux to debug the code, but it does not help. So is there any tool that I can use to trace the program line by line? By doing so I can understand the program flow. Please help!
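
    If the goal is simply to walk through the program line by line, gdb can already do that, provided the code is built with debug symbols. A minimal session sketch, with ./myprog and some_variable standing in for the real binary and variable names:

        $ gcc -g -O0 -o myprog main.c     # build with debug info and without optimisation
        $ gdb ./myprog
        (gdb) break main                  # stop at the entry point
        (gdb) run
        (gdb) next                        # execute the current line, stepping over calls
        (gdb) step                        # step into a function call
        (gdb) print some_variable         # inspect state as you go
        (gdb) backtrace                   # show how execution got here

    If the "other device" is an embedded target, the usual pattern is to run gdbserver on the device and attach a cross-gdb from the host, but the details depend on the platform.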

    Read the article

  • Fiddler - A useful free tool for checking Web Services (and web site) traffic

    - by TATWORTH
    Recently I had reason to be very glad that I had Fiddler. I was able to record some web service traffic and identify a problem, as Fiddler can record both the call to a web service and the response from it. By seeing the actual data traffic I was able to resolve a problem found in testing in less time than it has taken me to write this blog entry! This tool is also useful for studying general web site traffic. Fiddler is available from http://www.fiddler2.com/fiddler2/ and there are training videos available on the same site.

    Read the article

  • Tool Review: Telerik JustDecompile

    - by Sam Abraham
    In the next few lines, I will be providing a brief review of Telerik's JustDecompile, a free .NET decompiler and assembly browser. Using Telerik's 2012 Q3 JustDecompile release, one can see many great features. First off, I loved the built-in options for loading .NET assemblies automatically using the Open->Load Framework menu option. Other options enable loading assemblies from the GAC, from a XAP URL or locally from disk. The ability to create an "Assembly List" is quite handy for grouping and saving a "list" of DLLs to load. All loaded assemblies are shown in the left panel of a split-panel screen. Clicking an assembly expands all namespaces within it. Drilling further down to class level displays the actual source code in the right panel in IL, C# or Visual Basic. In conclusion, JustDecompile has grown and quickly matured into an indispensable, handy tool for us developers. Telerik's effort in maintaining and updating JustDecompile, as well as the company's commitment to keeping it free, is much appreciated and valued.

    Read the article

  • Best language or tool for automating tedious manual tasks [closed]

    - by Jon Hopkins
    We all have tasks that come up from time to time that we think we'd be better off scripting or automating than doing manually. Obviously some tools or languages are better for this than others - no one (in their right mind) is doing a one-off job of cross-referencing a bunch of text lists their PM has just given them in assembler, for instance. What one tool or language would you recommend for the sort of general quick-and-dirty jobs you get asked to do, where time (rather than elegance) is of the essence? Background: I'm a former programmer, now a development manager/PM, looking to learn a new language for fun. If I'm going to learn something for fun I'd like it to be useful, and this sort of use case is the most likely to come up.

    Read the article

  • svn-based versioning tool, problem with network timeout

    - by Scarlet
    My dev team has been tasked with building a versioning tool based on Subversion, to run on Windows (our svn client is SlikSVN). We're developing with Delphi XE2, should that matter. We're asked to implement a "check for updates availability" feature, which has to work as follows: connect to the SVN repo via the svn+ssh protocol; see if there are changes to receive and list them; let the user decide whether he wants to receive the changes or not. We don't have great knowledge of svn, so we thought to implement this client side through a number of CreateProcess calls that directly wrap the appropriate svn commands. What we have noticed, though, is that if a network problem arises, such as a connection drop, the svn client hangs forever waiting for the operation to finish instead of failing with a timeout. We know we could apply a timeout when waiting on the spawned process, but it wouldn't be correct to use one, as we can't know from the outside how long the svn operation will take to complete. Is there any way to avoid that hang?
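
    Since the traffic goes over svn+ssh, one knob worth knowing about is the ssh tunnel definition in Subversion's runtime config, where ssh itself can be told to give up on a dead link instead of waiting forever. A sketch of the relevant section (typically %APPDATA%\Subversion\config on Windows); the timeout values are arbitrary examples and assume an OpenSSH-style client is being invoked:

        [tunnels]
        # Fail if connecting takes more than 15s, or if 3 keep-alive probes (10s apart) go unanswered,
        # so the wrapped svn.exe exits with an error instead of hanging when the connection drops.
        ssh = ssh -o ConnectTimeout=15 -o ServerAliveInterval=10 -o ServerAliveCountMax=3

    This bounds the hang at the transport level, which means the CreateProcess-launched svn command terminates on its own and no artificial timeout has to be guessed from the outside.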

    Read the article

  • Best stats tool for cross-domain tracking

    - by kidbrax
    We build a webapp that allows users to run the app under their own subdomain, so we run the app under search.domainX.com, search.domainY.com and so on. Each client has their own Google Analytics property to track individual stats. But we also want to know the general traffic for all clients of our app combined, so we want to know things like "among all our clients we had x number of views." What is the best tool to track that sort of thing?
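
    One common way to get the combined numbers with Google Analytics is to run a second, shared tracker alongside each client's own: every subdomain keeps its individual property and also sends the same hits to one roll-up property. A sketch using analytics.js; the property IDs and the tracker name are placeholders:

        // Provided by the standard analytics.js snippet already on the page.
        declare const ga: (...args: any[]) => void;

        ga('create', 'UA-CLIENT-XXXX-1', 'auto');                     // the client's own property
        ga('create', 'UA-ROLLUP-XXXX-1', 'auto', { name: 'rollup' }); // shared roll-up property
        ga('send', 'pageview');          // counts toward the client's stats
        ga('rollup.send', 'pageview');   // counts toward the combined stats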

    Read the article

  • Will these tool tips get me penalized?

    - by user21100
    I have a page of 10 products. Each product has a list of (7) features associated with it. If a user hovers over the name of a feature (e.g., "Moisture resistance"), a tooltip description displays. The descriptions (a sentence or two) are loaded once on the page using JavaScript, but the titles are not, so I essentially have a bunch of redundant tooltip titles. I am concerned this will look like keyword stuffing to the bots. Does anyone know about this? Maybe I should load the feature titles with JavaScript as well?
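
    If you do go the route of injecting the titles the same way as the descriptions, it can be as simple as keeping both in one data structure and writing them into the elements on load. A small sketch; the markup hook and feature names are invented for illustration:

        // Assumed markup: <span class="feature" data-feature="moisture"></span>
        const features: Record<string, { title: string; tip: string }> = {
          moisture: { title: "Moisture resistance", tip: "Keeps working in humid rooms." },
          // ...one entry per feature...
        };

        document.querySelectorAll<HTMLElement>(".feature").forEach(el => {
          const f = features[el.dataset.feature ?? ""];
          if (!f) return;
          el.textContent = f.title; // title injected client-side rather than repeated in the HTML
          el.title = f.tip;         // plain native tooltip; swap in your tooltip widget here
        });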

    Read the article

  • fault tolerant uploading tool

    - by andersonbd1
    I'm setting up a WordPress site for a friend who has some somewhat large audio files (150 MB)... He's on a bad connection and I'd guess it'll take him a while to upload those files the normal WordPress way. I'm looking for a tool that I can install on the server that allows uploads and is also fault tolerant... for example, if he loses his connection, or power, or whatever, it'll pick up where it left off. I realize web technologies probably don't do that, but perhaps Flash or something? Any ideas?
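
    For background, most "resumable" uploaders work by slicing the file into chunks on the client, sending them one at a time, and asking the server how far it got before restarting. A browser-side sketch of that idea; the /upload endpoints and field names are hypothetical, and a matching server-side handler is still needed:

        // Chunked, resumable upload sketch (TypeScript); endpoints are made up for illustration.
        const CHUNK = 1024 * 1024; // 1 MB per request keeps each retry cheap

        async function uploadResumable(file: File): Promise<void> {
          // Ask the (hypothetical) server how much of this file it already holds.
          const res = await fetch(`/upload/status?name=${encodeURIComponent(file.name)}`);
          let offset = res.ok ? ((await res.json()).bytesReceived ?? 0) : 0;

          while (offset < file.size) {
            const chunk = file.slice(offset, offset + CHUNK);
            const put = await fetch(
              `/upload/chunk?name=${encodeURIComponent(file.name)}&offset=${offset}`,
              { method: "PUT", body: chunk });
            if (!put.ok) throw new Error(`chunk at ${offset} failed; re-run to resume`);
            offset += chunk.size;
          }
        }

    This is roughly what chunk-aware uploaders (Plupload's chunking option, for example) do under the hood, which is why they survive dropped connections.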

    Read the article

  • Most underestimated programming tool [closed]

    - by Anto
    We have many great tools which help a lot when programming, such as good programmers' text editors, IDEs, debuggers, version control systems etc. Some of these tools are more or less "must have" tools for getting the job done (e.g. compilers). There are still tools which help a lot but don't get so much attention, for various reasons - for instance, because they were ahead of their time when released and are now more or less forgotten. What type of programming tool do you think is the most underestimated one? Motivate your answer.

    Read the article

  • Replication Services as ETL extraction tool

    - by jorg
    In my last blog post I explained the principles of Replication Services and the possibilities it offers in a BI environment. One of the possibilities I described was the use of snapshot replication as an ETL extraction tool: "Snapshot Replication can also be useful in BI environments. If you don't need a near real-time copy of the database, you can choose this form of replication. Besides being an alternative to Transactional Replication, it can be used to stage data so it can be transformed and moved into the data warehousing environment afterwards. In many solutions I have seen developers create multiple SSIS packages that simply copy data from one or more source systems to a staging database that acts as the source for the ETL process. The creation of these packages takes a lot of (boring) time, while Replication Services can do the same in minutes. It is possible to filter out columns and/or records, and it can even apply schema changes automatically, so I think it offers enough features here. I don't know how the performance will be and whether it really works as well for this purpose as I expect, but I want to try this out soon!"

    Well, I have tried it out and I must say it worked well. I was able to let Replication Services do the work in a fraction of the time it would cost me to do the same in SSIS. What I did was the following:

    1. Configure snapshot replication for some AdventureWorks tables; this was quite simple and straightforward.
    2. Create an SSIS package that executes the snapshot replication on demand and waits for its completion. This is something you can't do with out-of-the-box functionality.

    While configuring the snapshot replication, two SQL Agent jobs are created: one for the creation of the snapshot and one for the distribution of the snapshot. Unfortunately these jobs are asynchronous, which means that when you execute them they immediately report back whether the job started successfully or not; they do not wait for completion and report the result afterwards. So I had to create an SSIS package that executes the jobs and waits for their completion before the rest of the ETL process continues. Fortunately I was able to create the SSIS package with the desired functionality. I have made a step-by-step guide that will help you configure the snapshot replication, and I have uploaded the SSIS package you need to execute it.

    Configure snapshot replication

    1. The first step is to create a publication on the database you want to replicate. Connect to SQL Server Management Studio, right-click Replication and choose New > Publication. The New Publication Wizard appears; click Next.
    2. Choose your "source" database and click Next, then choose Snapshot publication and click Next.
    3. You can now select the tables and other objects that you want to publish. Expand Tables and select the tables that are needed in your ETL process.
    4. In the next screen you can add filters on the selected tables, which can be very useful (think about selecting only the last x days of data, for example). It is possible to filter out rows and/or columns. In this example I did not apply any filters.
    5. Schedule the Snapshot Agent to run at a desired time. By doing this, a SQL Agent job is created which we need to execute from an SSIS package later on.
    6. Next you need to set the security settings for the Snapshot Agent: click the Security Settings button. In this example I ran the agent under the SQL Server Agent service account. This is not recommended as a security best practice. Fortunately there is an excellent article on TechNet which tells you exactly how to set up the security for Replication Services; read it and make sure you follow the guidelines!
    7. On the next screen choose to create the publication at the end of the wizard, give the publication a name (SnapshotTest) and complete the wizard. The publication is created and the articles (the tables, in this case) are added.

    Now that the publication is created successfully, it is time to create a new subscription for it.

    1. Expand the Replication folder in SSMS, right-click Local Subscriptions and choose New Subscriptions. The New Subscription Wizard appears.
    2. Select the publisher on which you just created your publication and select the database and publication (SnapshotTest).
    3. You can now choose where the Distribution Agent should run. If it runs at the distributor (push subscriptions) it causes extra processing overhead. If you use a separate server for your ETL process and databases, choose to run each agent at its subscriber (pull subscriptions) to reduce the processing overhead at the distributor.
    4. Of course we need a database for the subscription, and fortunately the wizard can create it for you: choose New database, give the database the desired name, set the desired options and click OK. You can also add multiple SQL Server subscribers, which is not necessary in this case but can be very useful.
    5. You now need to set the security settings for the Distribution Agent. Again, in this example I ran the agent under the SQL Server Agent service account; read the security best practices referenced above. Click Next.
    6. Make sure you create a synchronization job schedule again; this job is also needed in the SSIS package later on. Initialize the subscription at first synchronization.
    7. Select the first box to create the subscription when finishing this wizard, then complete the wizard by clicking Finish. The subscription will be created.

    In SSMS you will see that a new database has been created: the subscriber. There are no tables or other objects in the database yet, because the replication jobs have not run yet. Now expand the SQL Server Agent, go to Jobs and search for the job that creates the snapshot; rename this job to "CreateSnapshot". Then search for the job that distributes the snapshot and rename it to "DistributeSnapshot".

    Create an SSIS package that executes the snapshot replication

    We now need an SSIS package that will take care of the execution of both jobs. The CreateSnapshot job needs to execute and finish before the DistributeSnapshot job runs, and after the DistributeSnapshot job has started the package needs to wait until it has finished before the package execution completes. The Execute SQL Server Agent Job Task is designed to execute SQL Agent jobs from SSIS. Unfortunately this SSIS task only executes the job and reports back whether the job started successfully or not; it does not report whether the job actually completed with success or failure, because these jobs are asynchronous. The SSIS package I have created does the following:

    1. It runs the CreateSnapshot job.
    2. It checks every 5 seconds, in a loop, whether the job has completed.
    3. When the CreateSnapshot job is completed it starts the DistributeSnapshot job.
    4. Again it waits until the snapshot is delivered before the package finishes successfully.

    Quite simple, and the package is ready to use as a standalone extract mechanism (a rough sketch of this start-and-poll idea follows below).
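
    Here is that rough sketch, outside SSIS. It is not the author's package; it only illustrates starting an Agent job and polling msdb until it stops, and it assumes the Node.js "mssql" client plus placeholder connection settings.

        // Start-and-poll sketch (TypeScript, "mssql" npm package); job names follow the guide above.
        import * as sql from "mssql";

        const sleep = (ms: number) => new Promise<void>(r => setTimeout(r, ms));

        async function runJobAndWait(pool: sql.ConnectionPool, jobName: string): Promise<void> {
          // Ask SQL Server Agent to start the job; the call returns immediately (the job is asynchronous).
          await pool.request().query(`EXEC msdb.dbo.sp_start_job @job_name = N'${jobName}'`);

          // Poll every 5 seconds until the latest activity row for the job has a stop date.
          for (;;) {
            await sleep(5000);
            const result = await pool.request().query(`
              SELECT TOP 1 ja.stop_execution_date
              FROM msdb.dbo.sysjobactivity ja
              JOIN msdb.dbo.sysjobs j ON j.job_id = ja.job_id
              WHERE j.name = N'${jobName}'
              ORDER BY ja.run_requested_date DESC`);
            const row = result.recordset[0];
            if (row && row.stop_execution_date !== null) return; // job finished
          }
        }

        async function main() {
          // Placeholder connection settings; adjust for your environment.
          const pool = await sql.connect({
            server: "localhost", database: "msdb",
            user: "etl_user", password: "secret",
            options: { trustServerCertificate: true },
          });
          await runJobAndWait(pool, "CreateSnapshot");     // create the snapshot first
          await runJobAndWait(pool, "DistributeSnapshot"); // then deliver it to the subscriber
          await pool.close();
        }

        main().catch(err => { console.error(err); process.exit(1); });
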
    After executing the package, the replicated tables are added to the subscriber database and are filled with data. Download the SSIS package here (SSIS 2008).

    Conclusion

    In this example I replicated only 5 tables, and I could create an SSIS package that does the same in approximately the same amount of time. But if I replicated all 70+ AdventureWorks tables I would save a lot of time and boring work! With Replication Services you also benefit from the fact that schema changes are applied automatically, which means your entire extract phase won't break. Because a snapshot is created using the bcp (bulk copy) utility, it is also quite fast, so the performance should be good. The disadvantage of using snapshot replication as an extraction tool is the limitation on source systems: you can only choose SQL Server or Oracle databases to act as a publisher. So if you plan to build an extract phase for your ETL process that will involve a lot of tables, think about Replication Services; it could save you a lot of time, and thanks to the Extract SSIS package I've created you can fit it perfectly into your usual SSIS ETL process.

    Read the article

  • PHP ZendOptimizer on Red Hat Enterprise Linux

    - by Jacob Kristensen
    I would like to install FlashMoto, and the requirements are not unreasonable: PHP 5.2.1 or higher and Zend Optimizer 3.3 or higher. However, my RHEL 5.4 provides me with PHP 5.1.6. So I tried the remi repository http://rpms.famillecollet.com/ but it gave me PHP 5.3.1, and Zend Optimizer from zend.com does not support anything higher than 5.2.x. I also tried the dag repo, but it does not have PHP in any version. I also tried some RPMs that Oracle provides on their homepage, but they don't provide php-mbstring, which I also need. Does anyone know how to get PHP 5.2.1 installed on RHEL 5.4? Then I can probably manage to install the Zend part myself. Thanks in advance.

    Read the article

  • Online email tracing tool

    - by Clint
    About 2 years ago I came across an online tool that would allow you to append something to the end of a destination email address. When the email was opened, the tool would email you the recipient's geographical location. Does anyone know anything about this tool, or whether it still exists?

    Read the article

  • How to Use WinMerge as the Diff tool for Mercurial

    - by quanticle
    I'm using the Mercurial distributed version control system, and I'm wondering how I can configure it to use WinMerge instead of its own internal diff tool. I've already got WinMerge set up as the merge tool, but I want Mercurial to use WinMerge when I type "hg diff". Is there any way of doing that, or am I stuck with Mercurial's internal diff tool?
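
    The usual route is the bundled extdiff extension, which exposes an external diff program as its own Mercurial command. A sketch of the relevant Mercurial.ini / .hgrc section; the WinMerge path and switches may need adjusting for your install:

        [extensions]
        extdiff =

        [extdiff]
        ; "cmd.winmerge" creates a new command:  hg winmerge [-r REV] [FILE...]
        cmd.winmerge = C:\Program Files\WinMerge\WinMergeU.exe
        opts.winmerge = /e /ub

    Note that plain "hg diff" keeps using the internal diff; extdiff adds a separate command (hg winmerge in this sketch) rather than replacing it.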

    Read the article

  • Media file segmenter-like tool for Linux

    - by Raja
    Hello everyone, I'm looking for a tool for Linux which can segment a video file into multiple small .ts files. I know of one for Mac OS X called the media file segmenter, which is a simple command-line tool. I would really appreciate it if anyone could point me to an equivalent Linux tool. I don't know if these types of non-programming questions are allowed on this site; my apologies if not.
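
    In case it helps, ffmpeg's segment muxer can do this on Linux. A typical invocation looks roughly like the line below; the segment length and file names are placeholders, and depending on the input codec and ffmpeg version you may need extra bitstream-filter options when copying H.264 into MPEG-TS:

        ffmpeg -i input.mp4 -c copy -f segment -segment_time 10 out%03d.ts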

    Read the article

  • Rsync-like Windows backup tool

    - by Halfgaar
    Hi, I need to back up some Windows machines and have been unable to find the proper tool. What I need is a tool that does efficient copying of changed files to a Windows network location, like rsync does. In turn, the server will then back that up using rdiff-backup, a tool which does very clever incremental backups. Right now I'm using the backup feature included with Windows 7, but I really don't get along with it. The details are too off-topic here, but it doesn't suffice (and seems buggy as well). I looked into Amanda, but as soon as it wanted to install MySQL, I aborted. I also tried DeltaCopy, but unfortunately I don't remember what the problem with that was... Any advice for an rsync-like tool that just does daily syncs to a network location?
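
    On the built-in Windows side, the closest thing to this behaviour (copying only changed files to a network share, with retries and restartable transfers) is robocopy. A hedged example as a scheduled batch line, with the paths obviously placeholders:

        robocopy C:\Data \\backupserver\share\data /MIR /Z /FFT /R:3 /W:5
        :: /MIR mirrors the tree, /Z makes copies restartable, /FFT tolerates coarse NAS timestamps,
        :: /R and /W limit the retry count and the wait between retries.

    Note that robocopy decides per file (by timestamp and size); it does not do rsync's in-file delta transfer, which matches the "copy changed files" requirement but not rsync's bandwidth savings on partially changed files.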

    Read the article
