Search Results

Search found 10515 results on 421 pages for 'automatically'.

  • Query for value where a default namespace node exists

    - by Jay
    I have the following XML that is provided to me and I cannot change it: <Parent> <Settings Version="1234" xmlns="urn:schemas-stuff-com"/> </Parent> I am trying to retrieve the "Version" attribute value using XPath. It appears that, since the xmlns is defined without a prefix, it is automatically applied to the Settings node. When I read this XML into an XmlDocument and view the namespaceURI value for the Settings node, it is set to "urn:schemas-stuff-com". I have tried //Parent/Settings/@Version (returns null) and //Parent/urn:schemas-stuff-com:Settings/@Version (invalid syntax).
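
    A minimal sketch of the underlying idea, shown with Python's ElementTree rather than .NET's XmlDocument (an assumption for illustration; only the XML and the attribute name come from the question): a default namespace still has to be bound to a prefix of your own choosing before a path expression can address the element.

        import xml.etree.ElementTree as ET

        xml = '<Parent><Settings Version="1234" xmlns="urn:schemas-stuff-com"/></Parent>'
        root = ET.fromstring(xml)

        # Bind an arbitrary prefix ("s") to the default namespace and use it in the path;
        # an unprefixed "Settings" step would look in no namespace and find nothing.
        ns = {"s": "urn:schemas-stuff-com"}
        settings = root.find("s:Settings", ns)
        print(settings.get("Version"))  # prints: 1234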

    Read the article

  • Setting a CodeIgniter MySQL datetime column to time() always sets it to 0

    - by Jake
    Hi guys. I'm using CodeIgniter for a small project, and my model works correctly except for the dates. I have a column defined: created_at datetime not null, and my model code includes in the array passed into db->insert: 'created_at' => time() This produces a datetime value of 0000-00-00 00:00:00. When I change it to: 'created_at' => "from_unixtime(" . time() . ")" it still produces the 0 datetime value. What am I doing wrong? How can I set this field to the given Unix time? Also, I know MySQL sets TIMESTAMP columns automatically for you - I'm not interested in that solution here. So far I can't find a complete example of this on the web.
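
    For context, a DATETIME column expects a 'YYYY-MM-DD HH:MM:SS' string rather than a raw epoch integer, which is why an unconverted time() value collapses to zero. A small sketch of the conversion in Python (the question itself is PHP/CodeIgniter, where something like date('Y-m-d H:i:s') plays the same role; the timestamp below is made up):

        from datetime import datetime

        ts = 1275000000  # a Unix timestamp, i.e. what PHP's time() returns
        created_at = datetime.fromtimestamp(ts).strftime("%Y-%m-%d %H:%M:%S")
        print(created_at)  # a 'YYYY-MM-DD HH:MM:SS' string, the shape a DATETIME column accepts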

    Read the article

  • Disable system sleep during long builds

    - by Paul Alexander
    From time to time I need to run a full build of the entire tool chain for our software on my development machine. To save power, I've got my dev machine set to go to sleep after 20 minutes of inactivity. Building the full tool chain can take up to an hour and I'll often just go to lunch. However, if I forget to disable sleep I can return to a sleeping machine with the build only partially complete. What I'm looking for is a way to automatically disable sleep while MSBuild is running. Does anyone know of a simple way of doing this?
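
    One commonly suggested approach (a sketch under assumptions, not necessarily what the asker settled on) is to hold the system awake only while the build process is alive, via the Win32 SetThreadExecutionState call; shown here through Python's ctypes purely for illustration, with a placeholder build command:

        import ctypes
        import subprocess

        ES_CONTINUOUS = 0x80000000
        ES_SYSTEM_REQUIRED = 0x00000001

        # Tell Windows the system is required while this wrapper runs.
        ctypes.windll.kernel32.SetThreadExecutionState(ES_CONTINUOUS | ES_SYSTEM_REQUIRED)
        try:
            # Hypothetical build invocation; substitute the real MSBuild command line.
            subprocess.run(["msbuild", "FullToolChain.sln"], check=True)
        finally:
            # Clear the requirement so the normal sleep timer applies again.
            ctypes.windll.kernel32.SetThreadExecutionState(ES_CONTINUOUS)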

    Read the article

  • Is it possible to link directories in git?

    - by Andreas Selenwall
    I will start with a simplified example describing my intent. I have a repository my-rep.git containing two directories, src and deploy. In src I have my source code (NodeJS code, but that doesn't matter), and in deploy I want to keep my deploy configuration. So for example if I have a project, projectA, then the structure should look like this: my-rep.git/src/projectA my-rep.git/deploy/projectA/dotcloud.yml my-rep.git/deploy/projectA/src Now to my question. I want the source code in projectA to be available in the deploy directory for dotcloud. Is there any way I can make my-rep.git/deploy/projectA/src point to my-rep.git/src/projectA, so that when I do a git pull in deploy it automatically pulls into my-rep.git/deploy/projectA/src? It must be supported by git itself; symbolic Linux links won't work, as some developers on my team work on Windows.

    Read the article

  • When not to use a Drupal node?

    - by stotastic
    I've recently created a very simple CRUD table where the user stores some data. For the data, I created a custom node. The functionality works great for creating, editing, and deleting data in the CRUD table using the basic node functionality (I'm actually amazed how fast and easy it was to program the basic functionality with proper access controls using only a tiny bit of code)... Since the data isn't meant to be treated the same way as 'content' such as a blog post (no title, no body, no comments, no revisions, shouldn't show up on the ?q=node page, no previews, no teasers, etc.)... I find that I'm spending most of my time 'turning off' and modifying the stuff that Drupal does automatically for nodes. I know it's a matter of taste, but where should one draw the line on what should be treated as a node and what shouldn't? In other words, would it be better to program this stuff from scratch without using nodes?

    Read the article

  • Conditionally embed ASP.NET MVC2 Views as resources during build in Visual Studio 2010

    - by jslatts
    I have an ASP.NET MVC2 project in VS2010 that can be deployed in two modes: standalone or plugin. In standalone mode, the views should live outside the compiled assembly as .aspx files (the default setup). In plugin mode, the views are switched (currently by hand) to embedded resources and the entire assembly is dropped into a host project folder. Currently, this requires the developer to go through each view and switch it from Build Action: "Content" to "Embedded Resource" and vice versa. I would like to create a new solution configuration to automatically grab all .aspx files and build them as resources. This SO post seems like the solution, but I would prefer not to have to edit the .csproj every single time I add a new view to the project. Is there a way to use wildcards or some other batch/global conditional statement to change resources from content to embedded?

    Read the article

  • Delphi constants and references

    - by Sambatyon
    I want to pass constant references to functions in Delphi, so I am sure that the referenced object won't change, and to save time and memory. So I want to declare a function like function foo(var const Value : Bar) : Boolean; however, this is not allowed. I thought constant values would be automatically sent as references. However, I found out that this is not the case (getting the address of an object before sending it to the function gives me $12F50C, and the address of the same object inside the function is $12F564). What can I do to send constant references?

    Read the article

  • PowerShell 2.0 - Running scripts from the command line vs. from the ISE

    - by Gromix
    Hi, After writing deployment scripts from within the ISE, we need our CI server to be able to run them automatically, i.e. from the command line or via a batch file. I have noticed some significant differences between the following calls: powershell.exe -File Script.ps1; powershell.exe -Command "& '.\Script.ps1'"; and powershell.exe .\Script.ps1. Some simple examples: when using -File, errors are handled in exactly the same way as in the ISE, while the other two calls seem to ignore the $ErrorActionPreference variable and do not catch Write-Error in try/catch blocks. When using psake, the last two calls work perfectly, but using the ISE or the -File parameter will fail with the following error: The variable '$script:context' cannot be retrieved because it has not been set. Could someone help me understand the implications of each syntax, and why they are behaving differently? I would ideally like to find a syntax that works all the time and behaves like the ISE. Thanks, Romain

    Read the article

  • Is it a solvable problem to generate a regular expression that matches some input set?

    - by Roman
    I provide some input set which contains a known, separated number of text blocks. I want to make a program that automatically generates one or more regular expressions, each of which matches every text block in the input set. I see some relatively easy ways to implement a brute-force search. But I'm not an expert in compiler theory. That's why I'm curious: 1) is this problem solvable, or is there some fundamental impossibility in building such an algorithm? 2) is it possible to achieve polynomial complexity for this algorithm and avoid brute force?
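
    On the existence question, a trivial construction shows that some matching expression always exists for a finite input set; the hard part is generalising beyond it. A small Python sketch of that baseline (the sample strings are made up):

        import re

        def trivial_regex(blocks):
            # Alternation of the escaped literals: matches every block exactly,
            # but generalises to nothing outside the input set.
            return "|".join(re.escape(b) for b in blocks)

        samples = ["cat", "car", "cab"]  # hypothetical input set
        pattern = trivial_regex(samples)
        assert all(re.fullmatch(pattern, s) for s in samples)
        print(pattern)  # cat|car|cab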

    Read the article

  • How can one cache bust files referenced in a LESS file when using Symfony2, Twig, and Assetic?

    - by user3719083
    I have a web site built on Symfony2 which uses Twig templates, LESS, and Assetic. In order to cache bust assets, I'm simply using this in my config.yml: framework: templating: engines: ['twig'] assets_version: 'asset-version-here' And then I use the asset() function to load the asset, and the cache busting is handled for me. However, the concern I have is that when I load my LESS (css) file, there are references to other files, and I would like to know how these files can be cache busted as well. Example: .someSelector { background:url('../images/filename.png'); } How can I make sure that the referenced file, filename.png, is cache busted upon deployment? The asset files referenced in Twig using asset() are cache busted automatically upon deployment (I use a deployment script hook that updates the assets_version in the framework's config), but those referenced in a stylesheet are not. How can I do this?

    Read the article

  • Batch backup a hard drive without modifying access times in C#

    - by johnathan-doena
    I'm trying to write a simple program that will back up my flash drive. I want it to work automatically and silently in the background, and I also want it to be as quick as possible. The thing is, resetting all the access times is useless to me, and something I want to avoid. I know I can read the access times and set them back, but I bet that will fail one day in the future. It would be much simpler to read the files without ever changing them. Also, what is the fastest way to do this? What differences would there be between, say, a flash drive and an external hard drive? I am writing this in C#, as it is the simplest way to do it and it will probably last through more generations of Windows.

    Read the article

  • InnoDB Cascade Rule that looks at 2 columns?

    - by Travis
    I have the following MySQL InnoDB tables... TABLE foldersA ( ID title ) TABLE foldersB ( ID title ) TABLE records ( ID folderID folderType title ) folderID in table "records" can point to ID in either "foldersA" or "foldersB" depending on the value of folderType (0 or 1). I am wondering: is there a way to create a CASCADE rule such that the appropriate rows in table records are automatically deleted when a row in either foldersA or foldersB is deleted? Or in this situation, am I forced to delete the rows in table "records" programmatically? Thanks for your help!

    Read the article

  • Mobile web on Nokia devices not displaying certain elements

    - by Jan de Jager
    So I have a site which is rendered with our in-house portal engine. It resizes images and adjusts style sheets automatically in real time. The issue is that some HTML elements are inexplicably disappearing due to what can only be described as HTML compatibility problems. But the problem is not consistent, and only seems to be an issue on some Nokia devices. I have tried to install the Nokia Mobile Browser Emulator... but it's the worst piece of software I have seen in my life... after 4 hours of installing and uninstalling different versions of the JRE, I still can't get it to install. EDIT: Problem now residing at http://wiseguy.mobi/?PageID=657

    Read the article

  • Distributing an R package with optional S4 syntax sugar

    - by mariotomo
    I've written a small package for logging, which I'm distributing through r-forge. Recently I received some very interesting feedback on how to make it easier to use, but this functionality is based on something (setRefClass) that was added to R in 2.12. I'd like to keep distributing the package for R 2.9 as well, so I'm looking for a way to include or exclude the S4 syntactic sugar automatically, and include it when the library is loaded on an R >= 2.12 system. One other option I see is to write a small S4 package that requires 2.12, imports the simpler logging package and exports the syntactically sugared interface... I don't like it too much, as I'd need to choose a different name for the S4 package.

    Read the article

  • Git: What is a tracking branch?

    - by jerhinesmith
    Can someone explain a "tracking branch" as it applies to git? Here's the definition from git-scm.com: A 'tracking branch' in Git is a local branch that is connected to a remote branch. When you push and pull on that branch, it automatically pushes and pulls to the remote branch that it is connected with. Use this if you always pull from the same upstream branch into the new branch, and if you don't want to use "git pull" explicitly. Unfortunately, being new to git and coming from SVN, that definition makes absolutely no sense to me. I'm reading through "The Pragmatic Guide to Git" (great book, by the way), and they seem to suggest that tracking branches are a good thing and that after creating your first remote (origin, in this case), you should set up your master branch to be a tracking branch, but it unfortunately doesn't cover why a tracking branch is a good thing or what benefits you get by setting up your master branch to be a tracking branch of your origin repository. Can someone please enlighten me (in English)?

    Read the article

  • How to automate login to Google API to get OAuth 2.0 token to access known user account

    - by keyser_sozay
    Ok, so this question has been asked before here. In the response/answer to the question, the user is told to store the token in the application (in the session rather than the DB, although it doesn't matter where you store it). After going through the documentation on Google, it seems that the token has an expiration date after which it is no longer valid. Now, we could obviously refresh the token automatically at every fixed interval, thereby prolonging the lifespan of the token, but for some reason this manual process feels like a hack. My question is: is this the most effective (or generally accepted) way to access Google calendar/app data for a known user account, by manually logging in and persisting the token in the application? Or is there another mechanism that allows us to programmatically log in to this user account and go through the OAuth steps?
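
    For reference, the usual non-interactive path is to keep the long-lived refresh token from a one-time consent flow and exchange it for fresh access tokens programmatically. A rough Python sketch of that exchange against Google's OAuth 2.0 token endpoint (client ID, client secret and the stored refresh token are placeholders):

        import requests

        TOKEN_URL = "https://oauth2.googleapis.com/token"
        payload = {
            "client_id": "YOUR_CLIENT_ID",            # placeholder
            "client_secret": "YOUR_CLIENT_SECRET",    # placeholder
            "refresh_token": "STORED_REFRESH_TOKEN",  # obtained once via interactive consent
            "grant_type": "refresh_token",
        }
        resp = requests.post(TOKEN_URL, data=payload)
        resp.raise_for_status()
        access_token = resp.json()["access_token"]  # short-lived; request a new one when it expires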

    Read the article

  • PHPUnit 3.6.10 + ZendFramework 1.11.11 + NetBeans 7.1

    - by Vegetus
    When I create a controller with the following Zend_Tool command: create controller NameController ... NetBeans creates a new controller successfully. BUT, it sends a message to the output window: PHPUnit is required in order to generate controller test stubs. How do I get NetBeans to automatically create a class with PHPUnit testing? I have searched the net and tried thousands of solutions, but so far have not been able to solve it... Important notes: I have already installed PEAR. I have already installed PHPUnit. I have already included 'phpunit' in the include_path. I have created a zf.ini file and included it in the include_path too. I have already configured the NetBeans options, where I specified the path to phpunit.bat. Is it a bug in NetBeans?

    Read the article

  • Self-signed certificates for many users/browsers/sites

    - by Demiurg
    Here is my problem - I have a lot of users using different browsers accessing many internal web sites over https. I can create my own Certificate Authority, then create a certificate for each server, and after that have all the users import it. Obviously, that cannot work in reality - there are too many users and too many sites, and some sites will be added in the future. I'm looking for a way to automate this. Is there a way to create a certificate so that all major browsers (IE, FF, Opera, Chrome and Safari) would trust it for all servers? If so, what is the best way to install it automatically in all major browsers?

    Read the article

  • Working with multiple Git servers

    - by th3flyboy
    Hello, I have a question. Is it possible to set up a system where you have a private Git server that you host, which automatically syncs with a remote one hosted by a site like SourceForge, so that you can commit your local work to the private Git server and then, when you need to, merge the changes from the private WIP branches on your private Git over to the master/branch/tag on the public Git and push the change to the public Git? I ask this because I have a lot of personal work I would like to get working before putting it up for the public to see, and I'm shifting between several computers/operating systems in the process. If this is not possible in standard Git, are there any other options that would allow me to do this? Thanks, Peter

    Read the article

  • disabling transactional fixtures in Rspec has no effect

    - by Dia
    Due to a legacy database I'm using, I'm stuck with MySQL using MyISAM, which means my tables don't support transactions. This is causing the tests to fail, since table data generated by the tests (I'm using factory_girl for fixtures) is not reverted for each scenario. I discovered that Rspec exposes the config.use_transactional_fixtures setting in spec_helper.rb, which is set to true by default. When I set it to false, I don't see any effect on my tests; they still fail due to duplicate records. Isn't that setting supposed to automatically roll back any changes made to the DB? Or am I supposed to do that manually?

    Read the article

  • Reloading Rails Directories: Not Lib!

    - by yar
    I have checked out several questions on this, including all of those you see next to the question. Unfortunately, I'm not working with a plugin, and I don't want to work in lib. I have a directory called File.join(Rails.root, 'classes') and I'd like the classes in this directory to reload automatically in dev. In my environment.rb I have this line: config.load_paths << File.join(Rails.root, 'classes') which works fine and blows up if the path isn't there. The reloading line in my development.rb also works fine: require_dependency File.join(Rails.root, 'classes', 'blah.rb') which blows up if the file is not there (a good sign). However, the file doesn't reload. This all works if the file is in the root of lib and I use the require_dependency line, but my whole point is to get stuff out of lib, as suggested here.

    Read the article

  • AutoMapper determine what to map based on generic type

    - by Daz Lewis
    Hi, Is there a way to provide AutoMapper with just a source and, based on the specified mapping for the type of that source, automatically determine what to map to? So for example I have a type of Foo and I always want it mapped to Bar, but at runtime my code can receive any one of a number of generic types. public T Add(T entity) { //List of mappings var mapList = new Dictionary<Type, Type> { {typeof (Foo), typeof (Bar)}, {typeof (Widget), typeof (Sprocket)} }; //Based on the type of T determine what we map to...somehow! var t = mapList[entity.GetType()]; //What goes in ?? to ensure var in the case of Foo will be a Bar? var destination = AutoMapper.Mapper.Map<T, ??>(entity); } Any help is much appreciated.

    Read the article

  • Safari/Chrome (Webkit) - Cannot hide iframe vertical scrollbar

    - by BrainCore
    Hello, I have an iframe on www.mydomain.com that points to support.mydomain.com (which is a CNAME to a foreign domain). I automatically resize the height of my iframe so that the frame will not need any scrollbars to display the contained webpage. On Firefox and IE this works great, there is no scrollbar since I use <iframe ... scrolling="no"></iframe>. However, on webkit browsers (Safari and Chrome), the vertical scrollbar persists even when there is sufficient room for the page without the scrollbar (the scrollbar is grayed out). How do I hide the scrollbar for webkit browsers? Thanks, Ken

    Read the article

  • DOM: how to import nodes and give them different namespace prefix

    - by thomasrutter
    I'm familiar with the DOMDocument::importNode method for importing a tree of nodes from some other document element. However, what I was wondering is if I can automatically change the namespace prefix on a tree of nodes as I import them, that is, specify a new prefix for all nodes of that namespace. Say the nodes, in their existing document, all have names like "name", "identity", and so on. When importing them into my new document they will be alongside other namespaces, so I'd like them to appear as "nicnames:name", "nicnames:identity" and so on. I'd like to be able to change this prefix programmatically so that in another context I may be able to import them as, for instance, "myprefix:name", "myprefix:identity" depending on the document they're imported into. Can anyone help me understand how to do this? Thanks
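
    As an illustration of the general principle (using Python's ElementTree rather than PHP's DOMDocument, and with a made-up namespace URI), the prefix is just a per-document binding to the namespace URI, chosen when the document is written out:

        import xml.etree.ElementTree as ET

        # Bind the preferred output prefix to the namespace URI before serialising.
        ET.register_namespace("nicnames", "urn:example:people")
        identity = ET.Element("{urn:example:people}identity")
        print(ET.tostring(identity).decode())
        # <nicnames:identity xmlns:nicnames="urn:example:people" />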

    Read the article

  • Robocopy for Windows 2003 doesn't support /DST option

    - by Jon
    Does anyone know if it is possible to download the latest robocopy for Windows 2003? The latest version provides the /DST option, which ignores time stamp changes due to BST (British Summer Time). Every time we do a build and sync our servers when we go +1/-1 hour, it takes hours instead of minutes because it sees everything as changed. I noticed it is included automatically with Vista/Win7, but the Resource Kit that I downloaded doesn't include a new version of robocopy for Windows Server 2003. Is there a place to download it from, and will it also work on Windows Server 2003? Thanks.

    Read the article
