Search Results

Search found 5011 results on 201 pages for 'grand master t'.


  • Bulletproof way to DROP and CREATE a database under Continuous Integration.

    - by H. Abraham Chavez
    I am attempting to drop and recreate a database from my CI setup, but I'm finding it difficult to automate the dropping and creation of the database, which is to be expected given the complexities of the db being in use. Sometimes the process hangs, errors out with "db is currently in use", or just takes too long. I don't care if the db is in use; I want to kill it and create it again. Does someone have a straight-shot method to do this? Alternatively, does anyone have experience dropping all objects in the db instead of dropping the db itself?

        USE master
        -- Create a database
        IF EXISTS (SELECT name FROM sys.databases WHERE name = 'mydb')
        BEGIN
            ALTER DATABASE mydb SET SINGLE_USER -- or RESTRICTED_USER
            -- WITH ROLLBACK IMMEDIATE
            DROP DATABASE mydb
        END
        CREATE DATABASE mydb;
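
    One commonly suggested way to make the drop deterministic is to put the database into single-user mode WITH ROLLBACK IMMEDIATE, which kills existing connections and rolls back their open transactions instead of waiting on them. A minimal sketch, assuming the CI login has permission to drop the database and that it really is named mydb:

        USE master;
        IF EXISTS (SELECT name FROM sys.databases WHERE name = 'mydb')
        BEGIN
            -- Boot current users and roll back their open transactions immediately
            ALTER DATABASE mydb SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
            DROP DATABASE mydb;
        END
        CREATE DATABASE mydb;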

    Read the article

  • Color blindness: Are you aware of it? Do you design for it?

    - by User
    I'm curious whether many of us who do design, or who make design decisions, have ever heard of this problem. I'm aware there are dangerous color combinations, like green + red; this is probably one of the most common cases of color blindness. If you have green text on a red background, or vice versa, some people won't see anything. I've also seen in practice that green text on a blue background was not visible to one guy. What other color combinations should be avoided, and how often should these cases be expected? Let's rank them by how likely they are to be encountered, if anyone has the numbers. Addition: I've just remembered one very bad example that causes problems for just about everyone - blue text on a black background. It's unreadable for all intents and purposes. I could never understand what could possibly compel a webmaster to use this color combination...

    Read the article

  • An Actor "queue" ?

    - by synic
    In Java, to write a library that makes requests to a server, I usually implement some sort of dispatcher (not unlike the one found here in the Twitter4J library: http://github.com/yusuke/twitter4j/blob/master/twitter4j-core/src/main/java/twitter4j/internal/async/DispatcherImpl.java) to limit the number of connections, to perform asynchronous tasks, etc. The idea is that N threads are created. A "Task" is queued and all threads are notified, and one of the threads, when it's ready, will pop an item from the queue, do the work, and then return to a waiting state. If all the threads are busy working on a Task, then the Task is just queued, and the next available thread will take it. This keeps the max number of connections to N, and allows at most N Tasks to be operating at the same time. I'm wondering what kind of system I can create with Actors that will accomplish the same thing? Is there a way to have N Actors, and when a new message is ready, pass it off to an Actor to handle it - and if all Actors are busy, just queue the message?

    Read the article

  • Can't require local CoffeeScript modules

    - by superlukas
    I'm running Node.js 0.10.21. I tried both CoffeeScript 1.6.3 and master, both with and without require('coffee-script/extensions'). Compiling the two files to JavaScript and running them directly in Node works just fine, of course.

        # ./folder/a.coffee
        require('../b').test()

        # ./b.coffee
        exports.test = -> console.log 'yay'

        # $ coffee folder/a.coffee
        #
        # Error: Cannot find module '../b'
        #   at Function.Module._resolveFilename (module.js:338:15)
        #   at Function.Module._load (module.js:280:25)
        #   at Module.require (module.js:364:17)
        #   at require (module.js:380:17)
        #   at Object.<anonymous> (/Users/test/folder/a.coffee:1:1)
        #   at Module._compile (module.js:456:26)

    Read the article

  • Separation of static and dynamic content in Java EE applications

    - by Dan
    We work with IBM products and we typically use IBM HTTP Servers (read: Apache) as a reverse proxy for our application servers. For performance reasons we serve static content (.gif, .jpg, .css, .html, etc.) from our HTTP servers, to ease the burden on the application server a bit. So far, we have to distribute the files to the HTTP server and configure it manually (writing custom scripts at best). The problem is the effort needed to keep everything in sync, especially when you need to update the app. Does any Java EE product support this "out of the box"? Is there a way to have the application server do this automatically, like in a cluster configuration, for example, where the master node is in charge of distributing the application to the other nodes and keeping everything in sync?

    Read the article

  • How can I export images from SQL Server to a file on disk?

    - by rball
    I have a User table that has all of their avatars saved in an image field. I'd like to just take that out of the database and store it as a regular file on disk. I looked around and saw some code for textcopy, but that doesn't seem to be on my machine for some reason. Here is the code I wrote up anyway. Anyone know a way to get this done?

        DECLARE @exec_str varchar(255)
        SELECT @exec_str = 'textcopy /S (local)\SQLEXPRESS' +
            --' /U ' + @login +
            --' /P ' + @password +
            ' /D thedatabase' +
            ' /T User' +
            ' /C AvatarImage' +
            ' /F "d:\Avatars\' + User.Name + '.jpg"' +
            ' /O'
        FROM [User]
        WHERE UserID = 2
        EXEC master..xp_cmdshell @exec_str
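
    textcopy shipped with older SQL Server versions and is typically not present in newer installs, so one alternative that is often suggested is bcp in queryout mode, driven from xp_cmdshell. A minimal sketch, assuming xp_cmdshell is enabled, the database really is named thedatabase, and the file and format-file paths below are placeholders; the format file sets the prefix length to 0 so only the raw image bytes are written:

        -- Hypothetical non-XML format file d:\Avatars\avatar.fmt, with prefix
        -- length 0 so bcp writes only the raw image bytes:
        --   9.0
        --   1
        --   1  SQLIMAGE  0  0  ""  1  AvatarImage  ""

        DECLARE @cmd varchar(500)
        SET @cmd = 'bcp "SELECT AvatarImage FROM thedatabase.dbo.[User] WHERE UserID = 2" '
                 + 'queryout "d:\Avatars\user2.jpg" -T -S (local)\SQLEXPRESS -f d:\Avatars\avatar.fmt'
        EXEC master..xp_cmdshell @cmd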

    Read the article

  • php mail function cannot send to [email protected]?

    - by user333216
    I'm having trouble when sending emails through the mail() function. I have a script that works perfectly fine for an email address like [email protected], but when the first part of the email contains a dot, like [email protected], it doesn't work and returns this error: Warning: mail() [function.mail]: SMTP server response: 554 : Recipient address rejected: Relay access denied in confirmed.php on line 119. I am using a real email address but have changed it in the example above. Any thoughts? I'm not a php master, but surely there is an easy way to send emails to addresses with a two-part first section? Thanks in advance, Ali

    Read the article

  • How to solve the leaks when allocating the NSMutableArray in Objective-C

    - by Madan Mohan
    Hi guys, I am getting leaks in the master view controller of my iPhone app. When this method is called, I insert the parsed objects into the filteredListCount array, because when I search I need to show the list from filteredListCount; otherwise I show it from customerArray. This functionality is working fine, but I am getting leaks in the method below at the allocation:

        filteredListCount = [[NSMutableArray alloc] initWithCapacity:[customerArray count]];

    This is the first view controller of my application; I show the list and I also allow searching within it.

        - (void)parser:(CustomerListLibXmlParser *)parser addCustomerObject:(Customer *)customerObj1 {
            [UIApplication sharedApplication].networkActivityIndicatorVisible = YES;
            [customerArray addObject:customerObj1];
            // Note: this re-allocates filteredListCount for every parsed customer without
            // releasing the previous instance, which leaks under manual reference counting.
            filteredListCount = [[NSMutableArray alloc] initWithCapacity:[customerArray count]];
            [filteredListCount addObjectsFromArray:customerArray];
            [theTableView reloadData];
        }

        - (void)parser:(CustomerListLibXmlParser *)parser encounteredError:(NSError *)error {
        }

        - (void)parserFinished:(CustomerListLibXmlParser *)parser {
            [UIApplication sharedApplication].networkActivityIndicatorVisible = NO;
            self.title = @"Customers";
        }

    Read the article

  • Deleting branches in git causes gitk to go wild

    - by a2h
    I decided to delete a few branches from a (personal project) repository of mine that were merged into master, after confirming on #git that leftover branches aren't really necessary. However, gitk's visualisation of my repository's history has as a result been completely screwed up: branches appear out of nowhere from some commits and eventually rejoin other commits further ahead. A merge did not actually occur at all of those points, and I only had around 5 extra branches. Is this normal? Is there any fix for this?

    Read the article

  • Non-string role names in ASP.NET MVC?

    - by MikeWyatt
    ASP.NET MVC has good support for role-based security, but the usage of strings as role names is maddening, simply because they cannot be strongly-typed as enumerations. For example, I have an "Admin" role in my app. The "Admin" string will now exist in the Authorize attribute of my action, in my master page (for hiding a tab), in my database (for defining the roles available to each user), and any other place in my code or view files where I need to perform special logic for admin or non-admin users. Is there a better solution, short of writing my own authorization attribute and filter, that would perhaps deal with a collection of enumeration values?

    Read the article

  • Sharepoint navigation customization

    - by ifunky
    Hi, I've just started to use SharePoint 2007. MOSS is installed, but for the extranet I'm working on, publishing isn't switched on. What I need to do is customise the global navigation so that it displays all top-level & subsite links plus custom links, and format it nicely with CSS for the brand. Sounds easy, but for some reason it isn't! I've been reading numerous blogs and articles about it, but can't seem to find anything other than writing lots of code (which I don't mind) to hook into the API, or using a custom provider, which sounds extreme for something so simple. I was wondering what the options are for this; any good links to articles would be appreciated so I can finally get this simple task done. I see the MOSS navigation has extra options for global navigation, but I presume that as this isn't enabled I couldn't use the same code/control in the master page? Thanks Dan

    Read the article

  • Autotools: how to clean up files created by "./configure" in the lighttpd project?

    - by Andi
    Hi all, I'm starting to try out lighttpd for an embedded Linux project. I got the current source package and started writing a master makefile encapsulating all the configure, compile, install (for testing) etc. steps. And, vice versa, I want to clean up after every step. This cleanup should be 100%, i.e. there should be no generated files left after cleanup. This is important for repetitive tests. I wonder, is there a way to clean up all the stuff "./configure" generated? And are make uninstall, make clean, etc. 100%? I don't know the autotools in detail; I'm a beginner at using them. Any hints? Thanx, Andi

    Read the article

  • Completely remove ViewState for specific pages

    - by Kerido
    Hi everybody, I have a site that features some pages which do not require any post-back functionality. They simply display static HTML and don't even have any associated code. However, since the Master Page has a <form runat="server"> tag which wraps all ContentPlaceHolders, the resulting HTML always contains the ViewState field, i.e.: <input type="hidden" id="__VIEWSTATE" value="/wEPDwUKMjEwNDQyMTMxM2Rk0XhpfvawD3g+fsmZqmeRoPnb9kI=" /> I realize that, when decoded, this string corresponds to the <form> tag, which I cannot remove. However, I would still like to remove the ViewState field for pages that only display static HTML. Is it possible?

    Read the article

  • How to approach this SQL query

    - by Kim
    I have data related as follows: A table of Houses A table of Boxes (with an FK back into Houses) A table of Things_in_boxes (with an FK back to Boxes) A table of Owners (with an FK back into Houses) In a nutshell, a House has many Boxes, and each Box has many Things in it. In addition, each House has many Owners. If I know two Owners (say Peter and Paul), how can I list all the Things that are in the Boxes that are in the Houses owned by these guys? Also, I'd like to master this SQL stuff. Can anyone recommend a good book/resource? (I'm using MySQL). Thanks!
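
    A minimal sketch of the join, assuming hypothetical key and column names (id, box_id, house_id, and an owner name column), since the actual schema isn't shown:

        SELECT DISTINCT t.*
        FROM Things_in_boxes AS t
        JOIN Boxes  AS b ON b.id = t.box_id
        JOIN Houses AS h ON h.id = b.house_id
        JOIN Owners AS o ON o.house_id = h.id
        WHERE o.name IN ('Peter', 'Paul');

    As written, this lists things in houses owned by either Peter or Paul; if a house must be owned by both, group by the thing and keep only groups satisfying HAVING COUNT(DISTINCT o.name) = 2.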

    Read the article

  • Support both Standard mode and Quirks mode? Is that possible and necessary?

    - by tshao
    Today I was assigned a bug saying that some page elements don't work in IE8 Quirks mode at all, and I need to fix them. The point is that I believe our pages will always be rendered in Standards mode, because we specify a DOCTYPE at the beginning of every page (via the master page). I suspect some debugging tool changed that during testing. I managed to convince QA to close it as by design, after a brief explanation. Now I'm starting to wonder whether we should make our pages work in both Standards and Quirks mode. Maybe we should try to minimize the problem even if the page is not rendered in the expected mode? Any standards or best practices on that? Thanks!

    Read the article

  • How to retrieve all errors and messages from a query using ADO

    - by Johan Levin
    When a SQL batch returns more than one message, e.g. from print statements, I can only retrieve the first one using the ADO connection's Errors collection. How do I get the rest of the messages? If I run this script:

        Option Explicit
        Dim conn
        Set conn = CreateObject("ADODB.Connection")
        conn.Provider = "SQLOLEDB"
        conn.ConnectionString = "Data Source=(local);Integrated Security=SSPI;Initial Catalog=Master"
        conn.Open
        conn.Execute("print 'Foo'" & vbCrLf & "print 'Bar'" & vbCrLf & "raiserror ('xyz', 10, 127)")

        Dim error
        For Each error in conn.Errors
            MsgBox error.Description
        Next

    then I only get "Foo" back, never "Bar" or "xyz". Is there a way to get the remaining messages?

    Read the article

  • Publishing a web application from an MSBuild script using VS2010 targets resets the working directory

    - by Raoul
    I am trying to automatically publish and deploy my .Net 4 web application from a build script to be run by our continuous integration server. I am using the new _WPPCopyWebApplication target from VS2010 to perform the publish; however, it appears to reset the current working directory of the MSBuild project to c:\, which causes my prebuild steps to fail as they have relative paths to some external tools. The task I am running from our master.build file is as follows:

        <Target Name="PublishWeb">
          <MSBuild Projects="$(ProjectPath)"
                   Targets="ResolveReferences;_WPPCopyWebApplication"
                   Properties="WebProjectOutputDir=$(DeployPath);OutDir=$(TempOutputFolder)$(WebOutputFolder)\;OutputPath=$(ProjectPath)\bin\Debug;" />
        </Target>

    This does not happen when using the legacy _CopyWebApplication. Does anyone have any idea how to resolve this problem?

    Read the article

  • Embarrassingly parallel workflow creates too many output files

    - by Hooked
    On a Linux cluster I run many (N > 10^6) independent computations. Each computation takes only a few minutes and the output is a handful of lines. When N was small I was able to store each result in a separate file to be parsed later. With large N, however, I find that I am wasting storage space (on file creation) and simple commands like ls require extra care due to internal limits of bash: -bash: /bin/ls: Argument list too long. Each computation is required to run through a qsub scheduling algorithm, so I am unable to create a master program which simply aggregates the output data into a single file. The simple solution of appending to a single file fails when two programs finish at the same time and interleave their output. I have no admin access to the cluster, so installing a system-wide database is not an option. How can I collate the output data from an embarrassingly parallel computation before it gets unmanageable?

    Read the article

  • Improving HTML scraper efficiency with pcntl_fork()

    - by Michael Pasqualone
    With the help from two previous questions, I now have a working HTML scraper that feeds product information into a database. What I am now trying to do is improve efficiency by wrapping my brain around getting my scraper working with pcntl_fork. If I split my php5-cli script into 10 separate chunks, I improve total runtime by a large factor, so I know I am not I/O or CPU bound but just limited by the linear nature of my scraping functions. Using code I've cobbled together from multiple sources, I have this working test:

        <?php
        libxml_use_internal_errors(true);
        ini_set('max_execution_time', 0);
        ini_set('max_input_time', 0);
        set_time_limit(0);

        $hrefArray = array("http://slashdot.org", "http://slashdot.org", "http://slashdot.org", "http://slashdot.org");

        function doDomStuff($singleHref, $childPid) {
            $html = new DOMDocument();
            $html->loadHtmlFile($singleHref);
            $xPath = new DOMXPath($html);
            $domQuery = '//div[@id="slogan"]/h2';
            $domReturn = $xPath->query($domQuery);
            foreach ($domReturn as $return) {
                $slogan = $return->nodeValue;
                echo "Child PID #" . $childPid . " says: " . $slogan . "\n";
            }
        }

        $pids = array();
        foreach ($hrefArray as $singleHref) {
            $pid = pcntl_fork();
            if ($pid == -1) {
                die("Couldn't fork, error!");
            } elseif ($pid > 0) {
                // We are the parent
                $pids[] = $pid;
            } else {
                // We are the child
                $childPid = posix_getpid();
                doDomStuff($singleHref, $childPid);
                exit(0);
            }
        }

        foreach ($pids as $pid) {
            pcntl_waitpid($pid, $status);
        }

        // Clear the libxml buffer so it doesn't fill up
        libxml_clear_errors();

    Which raises the following questions: 1) Given my hrefArray contains 4 URLs - if the array were to contain, say, 1,000 product URLs, would this code spawn 1,000 child processes? If so, what is the best way to limit the number of processes to, say, 10, and - again with 1,000 URLs as an example - split the child workload to 100 products per child (10 x 100)? 2) I've learned that pcntl_fork creates a copy of the process and all variables, classes, etc. What I would like to do is replace my hrefArray variable with a DOMDocument query that builds the list of products to scrape, and then feed them off to child processes to do the processing - so spreading the load across 10 child workers. My brain is telling me I need to do something like the following (obviously this doesn't work, so don't run it):

        <?php
        libxml_use_internal_errors(true);
        ini_set('max_execution_time', 0);
        ini_set('max_input_time', 0);
        set_time_limit(0);

        $maxChildWorkers = 10;

        $html = new DOMDocument();
        $html->loadHtmlFile('http://xxxx');
        $xPath = new DOMXPath($html);
        $domQuery = '//div[@id=productDetail]/a';
        $domReturn = $xPath->query($domQuery);
        $hrefsArray[] = $domReturn->getAttribute('href');

        function doDomStuff($singleHref) {
            // Do stuff here with each product
        }

        // To figure out: split href array into $maxChildWorkers # of workArray1, workArray2 ... workArray10.
        $pids = array();
        foreach ($workArray(1,2,3 ... 10) as $singleHref) {
            $pid = pcntl_fork();
            if ($pid == -1) {
                die("Couldn't fork, error!");
            } elseif ($pid > 0) {
                // We are the parent
                $pids[] = $pid;
            } else {
                // We are the child
                $childPid = posix_getpid();
                doDomStuff($singleHref);
                exit(0);
            }
        }

        foreach ($pids as $pid) {
            pcntl_waitpid($pid, $status);
        }

        // Clear the libxml buffer so it doesn't fill up
        libxml_clear_errors();

    But what I can't figure out is how to build my hrefsArray[] in the master/parent process only and feed it off to the child processes. Currently everything I've tried causes loops in the child processes, i.e. my hrefsArray gets built in the master, and in each subsequent child process. I am sure I am going about this all totally wrong, so I would greatly appreciate just a general nudge in the right direction.

    Read the article

  • Flexibility starting to develop applications and DB

    - by kristaps
    Hi! First of all, a note for the admins: this post can help all DBAs and developers, so I hope you don't close it. I'm writing about a widespread problem. I'm working on my master's thesis, and I'd be grateful if you could take a little time to answer the questions in my form. The questions are about database structures. Visit here: http://spreadsheets.google.com/embeddedform?formkey=dEhmOWFGb2twWVFreWJwWXp2U3c5a1E6MQ After I have summarized the results, I will post them in a new post so developers can choose which structure to use to meet the requirements when developing new applications. I think that my test will help a lot of people. Best regards, Kristaps

    Read the article

  • How do I change a property's value based on a conditional in msbuild?

    - by Noel Kennedy
    I would like to change the value of a property if it is a certain value. In C# I would write:

        if (x == "NotAllowed") x = "CorrectedValue";

    This is what I have so far, please don't laugh:

        <PropertyGroup>
          <BranchName>BranchNameNotSet</BranchName>
        </PropertyGroup>

        <!-- Other targets set BranchName -->

        <Target Name="CheckPropertiesHaveBeenSet">
          <Error Condition="$(BranchName)==BranchNameNotSet" Text="Something has gone wrong.. branch name not entered"/>
          <When Condition="$(BranchName)==master">
            <PropertyGroup>
              <BranchName>MasterBranch</BranchName>
            </PropertyGroup>
          </When>
        </Target>

    Read the article

  • Create Access databases programmatically from VB.NET

    - by user87225
    Let me preface this by saying that I know this is a stupid way to go about this, but it needs to be handled in this way. I need to make an application that, from a master database, creates a number of Access database files (tables of a larger db). These are then manually given to users who fill in data, and the database files are emailed back to a user who, through the application, combines them again. The only part of this that I am unsure about is programmatically creating the Access dbs. I have read that through the Microsoft Jet OLE DB Provider and Microsoft ADO Ext I can create them (the tables and data), but I also need forms. I have yet to start writing anything and this is away from my area of work, so any insight/links would be much appreciated. Also, I would hope to be able to write this in the free Express version of Visual Studio. Are there components needed that would prevent me from doing this? Thanks.

    Read the article

  • Suggest an open source project which heavily uses Java concurrency utilities?

    - by user49767
    I have done a good amount of Java programming, but have yet to master threading & concurrency, and I would like to become an expert programmer in them. I have also taken a look at the Tomcat code, which I was able to understand, but I'm looking for an even more complex project. Could you suggest any open source project which heavily uses Java threading & concurrency utilities? Note: I have also been reading the java.util.concurrent package source code, but I'm eager to learn from an application perspective rather than by creating my own threading utilities.

    Read the article

  • Multiple RadUpload Control in One Page

    - by user159771
    I have an aspx page that uses a master page. In the aspx page, I load a user control containing 2 RadUpload controls (Rad1 and Rad2). The user can choose to upload the file using either the first RadUpload or the second RadUpload, and there is certain validation applied to each RadUpload. The weird thing is that when I upload a file using Rad2 (the second RadUpload), RadUpload.UploadedFiles for the first RadUpload is populated (count = 1). Instead of the file being treated as uploaded by Rad2, it is detected as if it were uploaded from Rad1, so my validation fails. Has someone encountered this problem before? This is a very weird thing and I've spent almost a day and a half fixing this without knowing what happened to the control.

    Read the article

  • Save in Sessions to reduce database load

    - by Kovu
    At the moment I'm trying to drastically reduce the load on my database, so I had a look at my website and thought about which database calls I can try to avoid. Is there a rule for that? Should I save in a Session every piece of information that almost never changes? For example: the User table is a 35-column table which I need so often and in so many different ways that at the moment I fetch this user object on nearly every page load AND in the master page's load (settings, displaying the username for a welcome message, colors, etc.). So is it good to avoid the database query here, save the user object in a Session and read it from the Session - and of course invalidate the Session entry wherever the user object gets changed (e.g. the user changes his settings)?

    Read the article
