Search Results

Search found 3987 results on 160 pages for 'batch rename'.


  • Deleted gen folder, Eclipse isn't regenerating it :(

    - by LuxuryMode
    I accidentally deleted my gen folder and now, predictably, my resources are all messed up. I just created a gen folder myself and tried to project clean - that didn't work. Tried right-clicking project and going to android tools fix project properties - didn't work. Tried unchecking build automatically...didn't work. cleaned, closed project, closed eclipse, restarted, etc, etc. Nothing is working and I keep seeing this error: gen already exists but is not a source folder. Convert to a source folder or rename it. EDIT - OK was able to generate R.java, but now I'm getting crazy stuff in the console: [2011-06-14 17:06:11 - fastapp] Conversion to Dalvik format failed with error 1 [2011-06-14 17:06:42 - fastapp] Dx trouble processing "java/awt/font/NumericShaper.class": Ill-advised or mistaken usage of a core class (java.* or javax.*) when not building a core library. This is often due to inadvertently including a core library file in your application's project, when using an IDE (such as Eclipse). If you are sure you're not intentionally defining a core class, then this is the most likely explanation of what's going on. However, you might actually be trying to define a class in a core namespace, the source of which you may have taken, for example, from a non-Android virtual machine project. This will most assuredly not work. At a minimum, it jeopardizes the compatibility of your app with future versions of the platform. It is also often of questionable legality. If you really intend to build a core library -- which is only appropriate as part of creating a full virtual machine distribution, as opposed to compiling an application -- then use the "--core-library" option to suppress this error message. If you go ahead and use "--core-library" but are in fact building an application, then be forewarned that your application will still fail to build or run, at some point. Please be prepared for angry customers who find, for example, that your application ceases to function once they upgrade their operating system. You will be to blame for this problem. If you are legitimately using some code that happens to be in a core package, then the easiest safe alternative you have is to repackage that code. That is, move the classes in question into your own package namespace. This means that they will never be in conflict with core system classes. JarJar is a tool that may help you in this endeavor. If you find that you cannot do this, then that is an indication that the path you are on will ultimately lead to pain, suffering, grief, and lamentation. [2011-06-14 17:06:42 - fastapp] Dx 1 error; aborting [2011-06-14 17:06:42 - fastapp] Conversion to Dalvik format failed with error 1 And eclipse can't resolve the import of my resources import com.me.fastapp.R;

    Read the article

  • flashcache with mdadm and LVM

    - by Backtogeek
    I am having trouble setting up flashcache on a system with LVM and mdadm, I suspect I am either just missing an obvious step or getting some mapping wrong and hoped someone could point me in the right direction? system info: CentOS 6.4 64 bit mdadm config md0 : active raid1 sdd3[2] sde3[3] sdf3[4] sdg3[5] sdh3[1] sda3[0] 204736 blocks super 1.0 [6/6] [UUUUUU] md2 : active raid6 sdd5[2] sde5[3] sdf5[4] sdg5[5] sdh5[1] sda5[0] 3794905088 blocks super 1.1 level 6, 512k chunk, algorithm 2 [6/6] [UUUUUU] md3 : active raid0 sdc1[1] sdb1[0] 250065920 blocks super 1.1 512k chunks md1 : active raid10 sdh1[1] sda1[0] sdd1[2] sdf1[4] sdg1[5] sde1[3] 76749312 blocks super 1.1 512K chunks 2 near-copies [6/6] [UUUUUU] pcsvan PV /dev/mapper/ssdcache VG Xenvol lvm2 [3.53 TiB / 3.53 TiB free] Total: 1 [3.53 TiB] / in use: 1 [3.53 TiB] / in no VG: 0 [0 ] flashcache create command used: flashcache_create -p back ssdcache /dev/md3 /dev/md2 pvdisplay --- Physical volume --- PV Name /dev/mapper/ssdcache VG Name Xenvol PV Size 3.53 TiB / not usable 106.00 MiB Allocatable yes PE Size 128.00 MiB Total PE 28952 Free PE 28912 Allocated PE 40 PV UUID w0ENVR-EjvO-gAZ8-TQA1-5wYu-ISOk-pJv7LV vgdisplay --- Volume group --- VG Name Xenvol System ID Format lvm2 Metadata Areas 1 Metadata Sequence No 2 VG Access read/write VG Status resizable MAX LV 0 Cur LV 1 Open LV 1 Max PV 0 Cur PV 1 Act PV 1 VG Size 3.53 TiB PE Size 128.00 MiB Total PE 28952 Alloc PE / Size 40 / 5.00 GiB Free PE / Size 28912 / 3.53 TiB VG UUID 7vfKWh-ENPb-P8dV-jVlb-kP0o-1dDd-N8zzYj So that is where I am at, I thought that was the job done however when creating a logical volume called test and mounting it is /mnt/test the sequential write is pathetic, 60 ish MB/s /dev/md3 has 2 x SSD's in Raid0 which alone is performing at around 800 MB/s sequential write and I am trying to cache /dev/md2 which is 6 x 1TB drives in raid6 I have read a number of pages through the day and some of them here, it is obvious from the results that the cache is not functioning but I am unsure why. I have added the filter line in the lvm.conf filter = [ "r|/dev/sdb|", "r|/dev/sdc|", "r|/dev/md3|" ] It is probably something silly but the cache is clearly performing no writes so I suspect I am not mapping it or have not mounted the cache correctly. dmsetup status ssdcache: 0 7589810176 flashcache stats: reads(142), writes(0) read hits(133), read hit percent(93) write hits(0) write hit percent(0) dirty write hits(0) dirty write hit percent(0) replacement(0), write replacement(0) write invalidates(0), read invalidates(0) pending enqueues(0), pending inval(0) metadata dirties(0), metadata cleans(0) metadata batch(0) metadata ssd writes(0) cleanings(0) fallow cleanings(0) no room(0) front merge(0) back merge(0) force_clean_block(0) disk reads(9), disk writes(0) ssd reads(133) ssd writes(9) uncached reads(0), uncached writes(0), uncached IO requeue(0) disk read errors(0), disk write errors(0) ssd read errors(0) ssd write errors(0) uncached sequential reads(0), uncached sequential writes(0) pid_adds(0), pid_dels(0), pid_drops(0) pid_expiry(0) lru hot blocks(31136000), lru warm blocks(31136000) lru promotions(0), lru demotions(0) Xenvol-test: 0 10485760 linear I have included as much info as I can think of, look forward to any replies.

    Read the article

  • Problem with file names containing spaces and single quotes?

    - by Vijay
    I'm using the following code to create thumbnails with ffmpeg. It works fine for files with no spaces or quotes, but when the file name has a space (like 'sachin knock.flv') or a quote (like sachin's_double_cent.mp4) it doesn't work. What can I do to make those files work correctly? One restriction is that I can't rename the files, as there are a lot of them. My code is:
    <?php
    error_reporting(E_ALL);
    extension_loaded('ffmpeg') or die('Error in loading ffmpeg');
    $link = mysql_connect('localhost', 'root', '');
    if (!$link) {
        die('Not connected : ' . mysql_error());
    }
    $db_selected = mysql_select_db('db', $link);
    $max_width = 120;
    $max_height = 72;
    $path = "/home/rootuser/public_html/temp/";
    $qry = "select id, input_file, output_file from videos where thumbnail='' or thumbnail is null;";
    $res = mysql_query($qry);
    while ($row = mysql_fetch_array($res, MYSQL_ASSOC)) {
        $orig_str = array(" ");
        $rep_str = array("\ ");
        $outfile = $row[output_file];
        // $infile = $row[input_file];
        $infile1 = str_replace($orig_str, $rep_str, $outfile);
        $tmp = explode(".", $infile1);
        $tmp_name = $tmp[0];
        $imgname = $tmp_name . ".png";
        $srcfile = "/home/rootuser/public_html/uploaded_vids/" . $outfile;
        echo exec("ffmpeg -i " . $srcfile . " -r 1 -ss 00:00:05 -f image2 -s 120x72 " . $path . $imgname);
        $nname = "./temp/" . $imgname;
        $fileo = fopen($nname, "rb");
        if ($fileo) {
            $imgData = addslashes(file_get_contents($nname));
            echo $imgdata;
            $qryy = "update videos set thumbnail='{$imgData}' where input_file='$outfile'";
            $ress = mysql_query($qryy);
        } else
            echo "Could not open<br><br>";
        unlink('$nname');
    }
    ?>
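    A minimal sketch of one way to handle this, assuming the goal is only to get the shell command right: pass every path through escapeshellarg() instead of hand-escaping spaces, so names containing spaces or single quotes reach ffmpeg intact. The paths mirror the ones in the question; the snippet illustrates the quoting technique rather than being a drop-in replacement for the loop above.

    <?php
    // Build the ffmpeg command with escapeshellarg() so spaces and single
    // quotes in the file name cannot break the command line.
    $outfile = "sachin knock.flv";   // example name taken from the question
    $srcfile = "/home/rootuser/public_html/uploaded_vids/" . $outfile;
    $imgname = pathinfo($outfile, PATHINFO_FILENAME) . ".png";
    $path    = "/home/rootuser/public_html/temp/";

    $cmd = "ffmpeg -i " . escapeshellarg($srcfile)
         . " -r 1 -ss 00:00:05 -f image2 -s 120x72 "
         . escapeshellarg($path . $imgname);
    exec($cmd, $output, $status);    // $status is 0 when ffmpeg succeeds
    ?>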

    Read the article

  • How to listen for file system changes on Mac - kFSEventStreamCreateFlagWatchRoot

    - by Cocoa Newbie
    Hi All, I am listening for Directory and disk changes in a COCOA project using FSEvents. I need to get events when a root folder is renamed or deleted. So, I passed kFSEventStreamCreateFlagWatchRoot while creating the FSEventStream.But even if I delete or rename the root folder I am not getting corresponding FSEventStreamEventFlags. Any idea what could possibly be the issue. I am listening for changes in a USB mounted device. I used both FSEventStreamCreate and FSEventStreamCreateRelativeToDevice. One thing I notices is when I try with FSEventStreamCreate I get the following error message while creating FSEventStream: (CarbonCore.framework) FSEventStreamCreate: watch_all_parents: error trying to add kqueue for fd 7 (/Volumes/NO NAME; Operation not supported) But with FSEventStreamCreateRelativeToDevice there are no errors but still not getting kFSEventStreamEventFlagRootChanged in event flags. Also, while creation using FSEventStreamCreateRelativeToDevice apple say's if I want to listen to root path changes pass emty string "". But I am not able to listen to root path changes by passing empty string. But when I pass "/" it works. But even for "/" I do not get any proper FSEventStreamEventFlags. I am pasting the code here: -(void) subscribeFileSystemChanges:(NSString*) path { PRINT_FUNCTION_BEGIN; // if already subscribed then unsubscribe if (stream) { FSEventStreamStop(stream); FSEventStreamInvalidate(stream); /* will remove from runloop */ FSEventStreamRelease(stream); } FSEventStreamContext cntxt = {0}; cntxt.info = self; CFArrayRef pathsToWatch = CFArrayCreate(NULL, (const void**)&path, 1, NULL); stream = FSEventStreamCreate(NULL, &feCallback, &cntxt, pathsToWatch, kFSEventStreamEventIdSinceNow, 1, kFSEventStreamCreateFlagWatchRoot ); FSEventStreamScheduleWithRunLoop(stream, CFRunLoopGetCurrent(), kCFRunLoopDefaultMode); FSEventStreamStart(stream); } call back function: static void feCallback(ConstFSEventStreamRef streamRef, void* pClientCallBackInfo, size_t numEvents, void* pEventPaths, const FSEventStreamEventFlags eventFlags[], const FSEventStreamEventId eventIds[]) {? char** ppPaths = (char**)pEventPaths; int i; for (i = 0; i < numEvents; i++) { NSLog(@"Event Flags %lu Event Id %llu", eventFlags[i], eventIds[i]); NSLog(@"Path changed: %@", [NSString stringWithUTF8String:ppPaths[i]]); } } Thanks a lot in advance.

    Read the article

  • Using IOperationBehavior to supply a WCF parameter

    - by Chris Kemp
    This is my first step into the world of stackoverflow, so apologies if I cock anything up. I'm trying to create a WCF Operation which has a parameter that is not exposed to the outside world, but is instead automatically passed into the function. So the world sees this: int Add(int a, int b) But it is implemented as: int Add(object context, int a, int b) Then, the context gets supplied by the system at run-time. The example I'm working with is completely artificial, but mimics something that I'm looking into in a real-world scenario. I'm able to get close, but not quite the whole way there. First off, I created a simple method and wrote an application to confirm it works. It does. It returns a + b and writes the context as a string to my debug. Yay. [OperationContract] int Add(object context, int a, int b); I then wrote the following code: public class SupplyContextAttribute : Attribute, IOperationBehavior { public void Validate(OperationDescription operationDescription) { if (!operationDescription.Messages.Any(m => m.Body.Parts.First().Name == "context")) throw new FaultException("Parameter 'context' is missing."); } public void ApplyDispatchBehavior(OperationDescription operationDescription, DispatchOperation dispatchOperation) { dispatchOperation.Invoker = new SupplyContextInvoker(dispatchOperation.Invoker); } public void ApplyClientBehavior(OperationDescription operationDescription, ClientOperation clientOperation) { } public void AddBindingParameters(OperationDescription operationDescription, BindingParameterCollection bindingParameters) { // Remove the 'context' parameter from the inbound message operationDescription.Messages[0].Body.Parts.RemoveAt(0); } } public class SupplyContextInvoker : IOperationInvoker { readonly IOperationInvoker _invoker; public SupplyContextInvoker(IOperationInvoker invoker) { _invoker = invoker; } public object[] AllocateInputs() { return _invoker.AllocateInputs().Skip(1).ToArray(); } private object[] IntroduceContext(object[] inputs) { return new[] { "MyContext" }.Concat(inputs).ToArray(); } public object Invoke(object instance, object[] inputs, out object[] outputs) { return _invoker.Invoke(instance, IntroduceContext(inputs), out outputs); } public IAsyncResult InvokeBegin(object instance, object[] inputs, AsyncCallback callback, object state) { return _invoker.InvokeBegin(instance, IntroduceContext(inputs), callback, state); } public object InvokeEnd(object instance, out object[] outputs, IAsyncResult result) { return _invoker.InvokeEnd(instance, out outputs, result); } public bool IsSynchronous { get { return _invoker.IsSynchronous; } } } And my WCF operation now looks like this: [OperationContract, SupplyContext] int Amend(object context, int a, int b); My updated references no longer show the 'context' parameter, which is exactly what I want. The trouble is that whenver I run the code, it gets past the AllocateInputs and then throws an Index was outside the bounds of the Array. error somewhere in the WCF guts. I've tried other things, and I find that I can successfully change the type of the parameter and rename it and have my code work. But the moment I remove the parameter it falls over. Can anyone give me some idea of how to get this to work (or if it can be done at all).

    Read the article

  • Random Computer Crashes

    - by Josh W.
    Ok, here's a wierd one for you all. Occasionally my PC here at work will crash in a very peculiar way. My dual monitors will suddenly go blank as if there is no longer a video signal, the USB mouse light will go dark and mouse stays unresponsive, the keyboard lights will not change status when the appropriate keys are pressed (Num/Caps/Scoll Lock). The CD Tray WILL open and close. But the computer will not respond to a ping request. For all intents & purposes it's as if the computer is off, except it wasn't intentional by me. The power light and internal fans are still on and I've now lost any unsaved work. Now here's where it gets wierd. This PC is part of a batch of PC's we got from a local vendor who does our initial system builds. Mine, and 6 other co-workers PCs all have the same issue. Originally we thought it was a bad combination of hardware, but through trial and error the only thing we haven't eliminated are the OS, Mobo & CPU. The problem was so bad for some of them that they ended up going back to their 5 year old dinosaurs in order to get some work done, for me the problem isn't as bad, maybe once every other day or so, but still enough to bite me in the ass if I've forgotten my ritualistic pressing of CTRL-S every 1-5 minutes. In this case we've tried two different video cards, two different power supplies, two different memory configurations, running on a UPS/not on UPS, updating/rolling back video drivers, three different bios revisions. The only things we haven't swapped are the mobo & cpu, mainly because a new mobo means a new Hardware Abstraction Layer, ie re-install of windows and there's alot of other software on this PC that takes forever to reload by hand. There was a base image that our systems team created with all the drivers installed and the basic setup of software our company uses, but they then must customize the setup for us programmers so it takes a while to get a new configuration up and going. I'm a programmer by day and am usually pretty good at diagnosing computer problems whether through trial and error or not. We've pretty much exhausted all the ideas we can think of here, short of a new mobo/cpu. Was hoping someone out there might have anything else we can try.. Relevant Parts: OS: XP Pro 32-bit Motherboard: Intel DG41RQ CPU: Intel Core-2 Quad Q9400 @ 2.66GHz Current BIOS Version/Date Intel Corp. RQG4110H.86A.0014.2010.0306.1151, 3/6/2010 Dual LCD's, Viewsonic VG930m & Samsung SyncMaster 910v (other people have different models, but listed in case there's some very wierd problem with the signals being sent/received) PS/2 Keyboard USB Microsoft Intellimouse BIOS Versions Tried: R 0013 12/23/2009 R 0014 3/6/2010 Video cards Tried GeForce 8400 GS Radeon HD 4350 - ASUS EAH4350 Two Different Power Supplies a 380W & 550W Ram Configurations 2GB - 1 x 2GB 4GB - 2 x 2GB

    Read the article

  • What to name 2 methods with the same signature

    - by coffeeaddict
    Initially I had a method in our DL that would take in the object it's updating like so: internal void UpdateCash(Cash Cash) { using (OurCustomDbConnection conn = CreateConnection("UpdateCash")) { conn.CommandText = @"update Cash set captureID = @captureID, ac_code = @acCode, captureDate = @captureDate, errmsg = @errorMessage, isDebit = @isDebit, SourceInfoID = @sourceInfoID, PayPalTransactionInfoID = @payPalTransactionInfoID, CreditCardTransactionInfoID = @CreditCardTransactionInfoID where id = @cashID"; conn.AddParam("@captureID", cash.CaptureID); conn.AddParam("@acCode", cash.ActionCode); conn.AddParam("@captureDate", cash.CaptureDate); conn.AddParam("@errorMessage", cash.ErrorMessage); conn.AddParam("@isDebit", cyberCash.IsDebit); conn.AddParam("@PayPalTransactionInfoID", cash.PayPalTransactionInfoID); conn.AddParam("@CreditCardTransactionInfoID", cash.CreditCardTransactionInfoID); conn.AddParam("@sourceInfoID", cash.SourceInfoID); conn.AddParam("@cashID", cash.Id); conn.ExecuteNonQuery(); } } My boss felt that creating an object every time just to update one or two fields is overkill. But I had a couple places in code using this. He recommended using just UpdateCash and sending in the ID for CAsh and field I want to update. Well the problem is I have 2 places in code using my original method. And those 2 places are updating 2 completely different fields in the Cash table. Before I was just able to get the existing Cash record and shove it into a Cash object, then update the properties I wanted to be updated in the DB, then send back the cash object to my method above. I need some advice on what to do here. I have 2 methods and they have the same signature. I'm not quite sure what to rename these because both are updating 2 completely different fields in the Cash table: internal void UpdateCash(int cashID, int paypalCaptureID) { using (OurCustomDbConnection conn = CreateConnection("UpdateCash")) { conn.CommandText = @"update Cash set CaptureID = @paypalCaptureID where id = @cashID"; conn.AddParam("@captureID", paypalCaptureID); conn.ExecuteNonQuery(); } } internal void UpdateCash(int cashID, int PayPalTransactionInfoID) { using (OurCustomDbConnection conn = CreateConnection("UpdateCash")) { conn.CommandText = @"update Cash set PaymentSourceID = @PayPalTransactionInfoID where id = @cashID"; conn.AddParam("@PayPalTransactionInfoID", PayPalTransactionInfoID); conn.ExecuteNonQuery(); } } So I thought hmm, maybe change the names to these so that they are now unique and somewhat explain what field its updating: UpdateCashOrderID UpdateCashTransactionInfoID ok but that's not really very good names. And I can't go too generic, for example: UpdateCashTransaction(int cashID, paypalTransactionID) What if we have different types of transactionIDs that the cash record holds besides just the paypalTransactionInfoID? such as the creditCardInfoID? Then what? Transaction doesn't tell me what kind. And furthermore what if you're updating 2 fields so you have 2 params next to the cashID param: UpdateCashTransaction(int cashID, paypalTransactionID, someOtherFieldIWantToUpdate) see my frustration? what's the best way to handle this is my boss doesn't like my first route?

    Read the article

  • CMake: Mac OS X: ld: unknown option: -soname

    - by Alex Ivasyuv
    I try to build my app with CMake on Mac OS X, I get the following error: Linking CXX shared library libsml.so ld: unknown option: -soname collect2: ld returned 1 exit status make[2]: *** [libsml.so] Error 1 make[1]: *** [CMakeFiles/sml.dir/all] Error 2 make: *** [all] Error 2 This is strange, as Mac has .dylib extension instead of .so. There's my CMakeLists.txt: cmake_minimum_required(VERSION 2.6) PROJECT (SilentMedia) SET(SourcePath src/libsml) IF (DEFINED OSS) SET(OSS_src ${SourcePath}/Media/Audio/SoundSystem/OSS/DSP/DSP.cpp ${SourcePath}/Media/Audio/SoundSystem/OSS/Mixer/Mixer.cpp ) ENDIF(DEFINED OSS) IF (DEFINED ALSA) SET(ALSA_src ${SourcePath}/Media/Audio/SoundSystem/ALSA/DSP/DSP.cpp ${SourcePath}/Media/Audio/SoundSystem/ALSA/Mixer/Mixer.cpp ) ENDIF(DEFINED ALSA) SET(SilentMedia_src ${SourcePath}/Utils/Base64/Base64.cpp ${SourcePath}/Utils/String/String.cpp ${SourcePath}/Utils/Random/Random.cpp ${SourcePath}/Media/Container/FileLoader.cpp ${SourcePath}/Media/Container/OGG/OGG.cpp ${SourcePath}/Media/PlayList/XSPF/XSPF.cpp ${SourcePath}/Media/PlayList/XSPF/libXSPF.cpp ${SourcePath}/Media/PlayList/PlayList.cpp ${OSS_src} ${ALSA_src} ${SourcePath}/Media/Audio/Audio.cpp ${SourcePath}/Media/Audio/AudioInfo.cpp ${SourcePath}/Media/Audio/AudioProxy.cpp ${SourcePath}/Media/Audio/SoundSystem/SoundSystem.cpp ${SourcePath}/Media/Audio/SoundSystem/libao/AO.cpp ${SourcePath}/Media/Audio/Codec/WAV/WAV.cpp ${SourcePath}/Media/Audio/Codec/Vorbis/Vorbis.cpp ${SourcePath}/Media/Audio/Codec/WavPack/WavPack.cpp ${SourcePath}/Media/Audio/Codec/FLAC/FLAC.cpp ) SET(SilentMedia_LINKED_LIBRARY sml vorbisfile FLAC++ wavpack ao #asound boost_thread-mt boost_filesystem-mt xspf gtest ) INCLUDE_DIRECTORIES( /usr/include /usr/local/include /usr/include/c++/4.4 /Users/alex/Downloads/boost_1_45_0 ${SilentMedia_SOURCE_DIR}/src ${SilentMedia_SOURCE_DIR}/${SourcePath} ) #link_directories( # /usr/lib # /usr/local/lib # /Users/alex/Downloads/boost_1_45_0/stage/lib #) IF(LibraryType STREQUAL "static") ADD_LIBRARY(sml-static STATIC ${SilentMedia_src}) # rename library from libsml-static.a => libsml.a SET_TARGET_PROPERTIES(sml-static PROPERTIES OUTPUT_NAME "sml") SET_TARGET_PROPERTIES(sml-static PROPERTIES CLEAN_DIRECT_OUTPUT 1) ELSEIF(LibraryType STREQUAL "shared") ADD_LIBRARY(sml SHARED ${SilentMedia_src}) # change compile optimization/debug flags # -Werror -pedantic IF(BuildType STREQUAL "Debug") SET_TARGET_PROPERTIES(sml PROPERTIES COMPILE_FLAGS "-pipe -Wall -W -ggdb") ELSEIF(BuildType STREQUAL "Release") SET_TARGET_PROPERTIES(sml PROPERTIES COMPILE_FLAGS "-pipe -Wall -W -O3 -fomit-frame-pointer") ENDIF() SET_TARGET_PROPERTIES(sml PROPERTIES CLEAN_DIRECT_OUTPUT 1) ENDIF() ### TEST ### IF(Test STREQUAL "true") ADD_EXECUTABLE (bin/TestXSPF ${SourcePath}/Test/Media/PlayLists/XSPF/TestXSPF.cpp) TARGET_LINK_LIBRARIES (bin/TestXSPF ${SilentMedia_LINKED_LIBRARY}) ADD_EXECUTABLE (bin/test1 ${SourcePath}/Test/test.cpp) TARGET_LINK_LIBRARIES (bin/test1 ${SilentMedia_LINKED_LIBRARY}) ADD_EXECUTABLE (bin/TestFileLoader ${SourcePath}/Test/Media/Container/FileLoader/TestFileLoader.cpp) TARGET_LINK_LIBRARIES (bin/TestFileLoader ${SilentMedia_LINKED_LIBRARY}) ADD_EXECUTABLE (bin/testMixer ${SourcePath}/Test/testMixer.cpp) TARGET_LINK_LIBRARIES (bin/testMixer ${SilentMedia_LINKED_LIBRARY}) ENDIF (Test STREQUAL "true") ### TEST ### ADD_CUSTOM_TARGET(doc COMMAND doxygen ${SilentMedia_SOURCE_DIR}/doc/Doxyfile) There was no error on Linux. Build process: cmake -D BuildType=Debug -D LibraryType=shared . 
make
I found that an incorrect link command is being generated in CMakeFiles/sml.dir/link.txt. But why, when the whole point of CMake is to be cross-platform? How can I fix it?

    Read the article

  • How do I (robustly) remotely execute tasks on Windows workstations in a domain?

    - by Zac B
    I'm not even sure if "robustly" is a word. Anyway. Context: We have a few hundred Windows 7 workstations on a LAN. We use AD/GPO management pretty heavily, but there are a lot of periodic and/or manual maintenance tasks we need to do that can't be done via GPO/scheduled task. For example, say I want to execute program X (which runs silently, in the background, and doesn't bother the user) on workstation Y, or say I want to execute task A on a workstation group B either on a schedule or on demand. Kicking the users off of their computers to do this (i.e. using RDP) is a no-no, and doesn't work on groups anyway. Question: What's the best way to do this that is robust enough that, after setup, I could give it to beginner support people (read: people who are phobic of the command line, and get confused with GUI interfaces more complicated than Firefox)? I'm a competent programmer, and, if there is a robust set of tools or framework out there for this type of task, I'd consider hacking something together myself if it didn't take too long. If there's some combination of tools or techniques that others use to make remote-workstation-administration doable by beginners, I have yet to find it. For those who care about the "why": I'm midlevel IT, and was told to implement a remote management solution that allows arbitrary/scheduled remote execution, with confirmation that programs actually ran remotely, and the ability to view what they returned. "Why?" I asked, "Can't I just use PsExec and the task scheduler on a dispatcher machine?" "No," I was told, "'Joe' the second-week tech is going to be in charge of this one, and he needs something simple with a GUI." What I've tried: I've played with making a bunch of one-clickable "transfer files to remote computer and run them with PsExec" batch/VB scrips, but those tend to break down and don't easily support running on customizable groups. I've played a little bit with the Windows version of Puppet, but it doesn't support arbitrary-time remote execution (it's ability to group computers into a tree/node structure is really nice though). I've used an older version of Altiris, and, while it does a lot of what I want, it's interface is awful, it's slow, crashes a lot, and is probably too expensive for management. SwiftWater's DMS solution does some of what I want, but it's very underdeveloped, closed-source (not a deal breaker but not ideal), and I get the impression that support and reliability are lacking.

    Read the article

  • Call PHPExcel from Joomla

    - by Oscar Calderon
    i have a problem about phpexcel and joomla. I'm developing some filter form to load excel reports, so i used phpexcel library to do this. Right now i have only a report, it works fine, but after that i upload inside joomla using PHP pages component that allows me to put php files inside joomla and call it. When i put them, i change a little bit the form that calls the php that generates the excel report, i call the php using a link like this: h**p://www.whiblix.com/index.php?option=com_php&Itemid=24 That is, calling it from Joomla, not directly the php. If i wanna call the php directly i could use this path: h**p://www.whiblix.com/components/com_php/files/repImportaciones.php What's the problem? The problem is, when i call the php that generates the excel through joomla, the excel that is downloaded is corrupt and only shows symbols in one cell when i open it. But if i call the php directly the report is generated fine. I could call the php directly, the problem is that if i call it directly i can't use this line of code: defined( '_JEXEC' ) or die( 'Restricted access' ); That is used to deny the direct access to php from call it directly, because it doesn' work because the security. Where's the problem? This is the code of php that generates the report (ommiting the code where generates the rows and cells): <?php //defined( '_JEXEC' ) or die( 'Restricted access' ); /** Error reporting */ error_reporting(E_ALL); date_default_timezone_set('Europe/London'); require_once 'Classes/PHPExcel.php'; // Create new PHPExcel object $objPHPExcel = new PHPExcel(); // Set properties $objPHPExcel->getProperties()->setCreator("Maarten Balliauw") ->setLastModifiedBy("Maarten Balliauw") ->setTitle("Office 2007 XLSX Test Document") ->setSubject("Office 2007 XLSX Test Document") ->setDescription("Test document for Office 2007 XLSX, generated using PHP classes.") ->setKeywords("office 2007 openxml php") ->setCategory("Test result file"); // Rename sheet $objPHPExcel->getActiveSheet()->setTitle('Reporte de Importaciones'); // Set active sheet index to the first sheet, so Excel opens this as the first sheet $objPHPExcel->setActiveSheetIndex(0); // Redirect output to a client’s web browser (Excel5) header('Content-Type: application/vnd.ms-excel'); header('Content-Disposition: attachment;filename="repPrueba.xls"'); header('Cache-Control: max-age=0'); $objWriter = PHPExcel_IOFactory::createWriter($objPHPExcel, 'Excel5'); $objWriter->save('php://output'); exit;
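    One frequent cause of exactly this symptom - and it is an assumption here, since the question doesn't show what Joomla sends before the report - is that the component or template has already emitted HTML or whitespace by the time the Excel headers go out, so the binary stream gets mixed with page output and Excel sees a corrupt file. A minimal sketch of the usual workaround, replacing the header/save block at the end of the question's script, is to discard any buffered output right before the headers:

    // Sketch: drop whatever Joomla has already buffered before streaming the
    // workbook, so only the Excel bytes reach the browser.
    // $objPHPExcel is the workbook built earlier in the question's script.
    while (ob_get_level() > 0) {
        ob_end_clean();
    }
    header('Content-Type: application/vnd.ms-excel');
    header('Content-Disposition: attachment;filename="repPrueba.xls"');
    header('Cache-Control: max-age=0');
    $objWriter = PHPExcel_IOFactory::createWriter($objPHPExcel, 'Excel5');
    $objWriter->save('php://output');
    exit;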

    Read the article

  • Get renamed file names of a multiple upload form [js array] in CodeIgniter

    - by artmania
    Hi friends, I use codeigniter. I have a multiple image upload form. The code below is working well for uploading, but I also need to save file names to database. How can I get the names in here? I spent hours & hours :/ but could not sort it :/ Appreciate helps!!! uploadform.php echo form_open_multipart('gallery/upload'); <input type="file" name="photo" size="50" /> <input type="file" name="thumb" size="50" /> <input type="submit" value="Upload" /> </form> I have a controller between form view and model load model (of course : )) but didnt post here because of no need. gallery_model.php function multiple_upload($upload_dir = 'uploads/', $config = array()) { /* Upload */ $CI =& get_instance(); $files = array(); if(empty($config)) { $config['upload_path'] = realpath($upload_dir); $config['allowed_types'] = 'gif|jpg|jpeg|jpe|png'; $config['max_size'] = '2048'; } $CI->load->library('upload', $config); $errors = FALSE; foreach($_FILES as $key => $value) { if( ! empty($value['name'])) { if( ! $CI->upload->do_upload($key)) { $data['upload_message'] = $CI->upload->display_errors(ERR_OPEN, ERR_CLOSE); // ERR_OPEN and ERR_CLOSE are error delimiters defined in a config file $CI->load->vars($data); $errors = TRUE; } else { // Build a file array from all uploaded files $files[] = $CI->upload->data(); } } } // There was errors, we have to delete the uploaded files if($errors) { foreach($files as $key => $file) { @unlink($file['full_path']); } } elseif(empty($files) AND empty($data['upload_message'])) { $CI->lang->load('upload'); $data['upload_message'] = ERR_OPEN.$CI->lang->line('upload_no_file_selected').ERR_CLOSE; $CI->load->vars($data); } else { return $files; } /* ------------------------------- Insert to database */ // problem is here, i need file names to add db. // if there is already same names file at the folder, it rename file itself. so in such case, I need renamed file name :/ } }
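    For the database step, the array returned by $CI->upload->data() already carries the name CodeIgniter actually wrote to disk: the 'file_name' key reflects any automatic renaming, and 'orig_name' keeps the name as uploaded. A rough sketch of the insert, assuming the database library is loaded on $CI and using made-up table and column names since the question's schema isn't shown:

    // Inside multiple_upload(), after all uploads succeeded.
    // 'gallery', 'photo_name' and 'original_name' are hypothetical names.
    foreach ($files as $file) {
        $CI->db->insert('gallery', array(
            'photo_name'    => $file['file_name'],   // renamed name on disk
            'original_name' => $file['orig_name'],   // name the user uploaded
        ));
    }
    return $files;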

    Read the article

  • List of checkboxes

    - by Andrea Girardi
    Hi all, Happy New Year to all. I'm a newbie in VB.NET and ASP.NET. This is my problem: I retrieve a list of records from DB and, for every row, I need to show 4 checkboxes. I can use a checkboxlist for every rows, but it's not so clear how I can process the results after the submit. I've some object and some operations available for that object. From database I extract a list of object with all operations. For every operation I want to show a check box to enable or disable the operation. The result is something like that: OBJ1 - url - [] [x] [] OBJ2 - url - [] [x] [x] On url I've an href to another page created using the Id retrieved from DB. To create that I used this code: <td class="column-filename"> <strong> <asp:Label runat="server" Text='<%#DataBinder.Eval(Container.DataItem, "GroupName")%>'></asp:Label> </strong> </td> <td align="left"> <span style="vertical-align:middle"> <asp:CheckBoxList runat="server" ID="operations" RepeatDirection="Horizontal" RepeatLayout="Table"> <asp:ListItem Text="View"></asp:ListItem> <asp:ListItem Text="Upload"></asp:ListItem> <asp:ListItem Text="Move"></asp:ListItem> <asp:ListItem Text="Delete"></asp:ListItem> <asp:ListItem Text="Rename"></asp:ListItem> <asp:ListItem Text="Replace"></asp:ListItem> </asp:CheckBoxList> </span> </td> </asp> </asp> my problem is: how can I parse all checkboxes? could you help me or send me a link or any other resources to solve my issue? many thanks! Andrea

    Read the article

  • DLL configuration file in an ASP.NET site

    - by Tominator
    Hi, I've made a .net 2.0 librabry project, that results in a dll. I've made an app.config file in my project, with settings used in the dll, with the intention that they can be changed later. I'm attempting to use the dll in an asp.net web application now, so I made the reference to my other project's output, and I see that the dll is copied over to the site's bin folder, and everything works. However, the configuration file is not copied. When I manually copy the app.config and rename it to myDll.config, it has no influence. The contents of the config file is approximately this: <?xml version="1.0" encoding="utf-8" ?> <configuration> <configSections> <sectionGroup name="applicationSettings" type="System.Configuration.ApplicationSettingsGroup, System, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" > <section name="myDLL.My.MySettings" type="System.Configuration.ClientSettingsSection, System, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" requirePermission="false" /> </sectionGroup> </configSections> <applicationSettings> <myDLL.My.MySettings> <setting name="myDLL_webservice_Service" serializeAs="String"> <value>https://myhost/Service.asmx</value> </setting> <setting name="ID" serializeAs="String"> <value>6</value> </setting> </myDLL.My.MySettings> </applicationSettings> </configuration> And I use its settings in the dll with this (vb.net code): Private _id As Long = My.Settings.ID How can I put my config information somewhere so it can be used? In the web.config of the site application? That has only the appSettings section, and it uses the syntax. It doesn't appear to work though. In a custom file format that I create and use? Not that pretty..

    Read the article

  • Is there a way to efficiently yield every file in a directory containing millions of files?

    - by Josh Smeaton
    I'm aware of os.listdir, but as far as I can gather, that gets all the filenames in a directory into memory, and then returns the list. What I want, is a way to yield a filename, work on it, and then yield the next one, without reading them all into memory. Is there any way to do this? I worry about the case where filenames change, new files are added, and files are deleted using such a method. Some iterators prevent you from modifying the collection during iteration, essentially by taking a snapshot of the state of the collection at the beginning, and comparing that state on each move operation. If there is an iterator capable of yielding filenames from a path, does it raise an error if there are filesystem changes (add, remove, rename files within the iterated directory) which modify the collection? There could potentially be a few cases that could cause the iterator to fail, and it all depends on how the iterator maintains state. Using S.Lotts example: filea.txt fileb.txt filec.txt Iterator yields filea.txt. During processing, filea.txt is renamed to filey.txt and fileb.txt is renamed to filez.txt. When the iterator attempts to get the next file, if it were to use the filename filea.txt to find it's current position in order to find the next file and filea.txt is not there, what would happen? It may not be able to recover it's position in the collection. Similarly, if the iterator were to fetch fileb.txt when yielding filea.txt, it could look up the position of fileb.txt, fail, and produce an error. If the iterator instead was able to somehow maintain an index dir.get_file(0), then maintaining positional state would not be affected, but some files could be missed, as their indexes could be moved to an index 'behind' the iterator. This is all theoretical of course, since there appears to be no built-in (python) way of iterating over the files in a directory. There are some great answers below, however, that solve the problem by using queues and notifications. Edit: The OS of concern is Redhat. My use case is this: Process A is continuously writing files to a storage location. Process B (the one I'm writing), will be iterating over these files, doing some processing based on the filename, and moving the files to another location. Edit: Definition of valid: Adjective 1. Well grounded or justifiable, pertinent. (Sorry S.Lott, I couldn't resist). I've edited the paragraph in question above.

    Read the article

  • FTP script needs blank line

    - by Ones and Zeroes
    I am trying to determine the reason for some FTP servers requiring a blank line in the script as follows: open server.com username ftp_commands bye Refer to blank line required after username credentials. Example from: FTP from batch file another reference to the same: http://newsgroups.derkeiler.com/Archive/Comp/comp.sys.ibm.as400.misc/2008-05/msg00227.html Also discussed here: archive.midrange.com/midrange-l/200601/msg00048.html "The behavior I'm observing is the same as if I didn't specify the password to login." with an answer referring to our same fix... archive.midrange.com/midrange-l/200601/msg00053.html and archive.midrange.com/midrange-l/200601/msg00065.html Note: It is my experience that FTP questions attract uncouth responses. Admittedly FTP is outdated, but many clients still have legacy systems, which they cannot upgrade or replace. The reason thereof should not be discussed here. The intention of this question is to invite a positive response. Please do not respond if you disagree with the above. If you have never encountered this same issue, please do not respond. I suspect this may be limited to FTP scripts executed from Windows machines, but have been told that this happens often and with many different servers. My specific interest is to understand what may cause this as I have a real world example of a production system suddenly requiring this as a workaround fix, after running for many years without issue. The server belongs to a third party who claims no change on their end. Server details unknown and cannot be determined. Any help or encouragement from someone who has come across the same, would be appreciated. ps. Sorry for the many words and references to painful responses, but I have asked similar questions on serverfault and elsewhere and unfortunately got back kneejerk responses to FTP and respondents debating the validity of the question. I would truly not ask, or re-post this question online if I had a better understanding of the issue. I know of people who have seen this issue, but don't know what causes it. I am wary that this question would again turn into another irrelevant discussion. Please, I ask very nicely: Please do not respond if you have not encountered a similar issue. FURTHER EDIT: Please do not suggest changing the product. The problem is not the blank line requirement. We know this fixes the issue. The problem is not being able to explain the reason for the blank line in the first place. Slight difference, but a critical point to note wrt the answering of this question.

    Read the article

  • Binding one dependency property to another

    - by Gregory Dodd
    I have a custom Tab Control that I have created, but I am having an issue. I have an Editable TextBox as part of the custom TabControl View. <Controls:EditableTextControl x:Name="PageTypeName" Style="{StaticResource ResourceKey={x:Type Controls:EditableTextControl}}" Grid.Row="0" TabIndex="0" Uid="0" AutomationProperties.AutomationId="PageTypeNameTextBox" AutomationProperties.Name="PageTypeName" Visibility="{Binding ElementName=PageTabControl,Path=ShowPageType}"> <Controls:EditableTextControl.ContextMenu> <ContextMenu x:Name="TabContextMenu"> <MenuItem Header="Rename Page Type" Command="{Binding Path=PlacementTarget.EnterEditMode, RelativeSource={RelativeSource AncestorType=ContextMenu}}" AutomationProperties.AutomationId="RenamePageTypeMenuItem" AutomationProperties.Name="RenamePageType"/> <MenuItem Header="Delete Page Type" Command="{Binding Path=PageTypeDeletedCommand}" AutomationProperties.AutomationId="DeletePageTypeMenuItem" AutomationProperties.Name="DeletePageType"/> </ContextMenu> </Controls:EditableTextControl.ContextMenu> <Controls:EditableTextControl.Content> <!--<Binding Path="CurrentPageTypeViewModel.Name" Mode="TwoWay"/>--> <Binding ElementName="PageTabControl" Path="CurrentPageTypeName" Mode ="TwoWay"/> </Controls:EditableTextControl.Content> </Controls:EditableTextControl> In the Content section I am binding to a Dependency Prop called CurrentPageTypeName. This Depedency prop is part of this custom Tab Control. public static DependencyProperty CurrentPageTypeNameProperty = DependencyProperty.Register("CurrentPageTypeName", typeof(object), typeof(TabControlView)); public object CurrentPageTypeName { get { return GetValue(CurrentPageTypeNameProperty) as object; } set { SetValue(CurrentPageTypeNameProperty, value); } } In another view, where I am using the custom TabControl I then bind my property, with the actual name value, to CurrentPageTypeName property as seen below: <Views:TabControlView Grid.Row="0" Name="RunPageTabControl" TabItemsSource="{Binding RunPageTypeViewModels}" SelectedTab="{Binding Converter={StaticResource debugConverter}}" CurrentPageTypeName="{Binding Path=RunPageName, Mode=TwoWay}" TabContentTemplateSelector="{StaticResource tabItemTemplateSelector}" SelectedIndex="{Binding RelativeSource={RelativeSource FindAncestor, AncestorType={x:Type UserControl}}, Path=DataContext.SelectedTabIndex}" ShowPageType="Hidden" > <!--<Views:TabControlView.TabContentTemplate> <DataTemplate DataType="{x:Type ViewModels:RunPageTypeViewModel}"> <RunViews:RunPageTypeView/> </DataTemplate> </Views:TabControlView.TabContentTemplate>--> </Views:TabControlView> My problem is that nothing seems to be happening. It is grabbing its Content from the Itemsource, and not from my chained Dependency props. Is what I am trying even possible? If so, what have I done wrong. Thanks for looking.

    Read the article

  • Select elements in nested divs using jQuery

    - by PIKP
    I have the following html markup inside a Div named item and I want to select all the elements (inside nested divs) and clear the values. As shown in following given Jquery I have managed to access elements in each Div by using.children().each(). But the the problem is .children().each()goes one level down at a time from the parent div, so I have repeated the same code block with multiple .children() to access the elements inside nested Divs, can anyone suggest me a method to do this without repeating the code for N number of nested divs . html markup <div class="item"> <input type="hidden" value="1234" name="testVal"> <div class="form-group" id="c1"> <div class="controls "> <input type="text" value="Results" name="s1" maxlength="255" id="id2"> </div> </div> <div class="form-group" id="id4"> <input type="text" value="Results" name="s12" maxlength="255" id="id225"> <div class="form-group" id="id41"> <input type="text" value="Results" name="s12" maxlength="255" id="5"> <div class="form-group" id="id42"> <input type="text" value="Results" name="s12" maxlength="255" id="5"> <div class="form-group" id="id43"> <input type="text" value="Results" name="s12" maxlength="255" id="id224"> </div> </div> </div> </div> </div> My Qjuery script var row = $(".item:first").clone(false).get(0); $(row).children().each(function () { updateElementIndex(this, prefix, formCount); if ($(this).attr('type') == 'text') { $(this).val(''); } if ($(this).attr('type') == 'hidden' && ($(this).attr('name') != 'csrfmiddlewaretoken')) { $(this).val(''); } if ($(this).attr('type') == 'file') { $(this).val(''); } if ($(this).attr('type') == 'checkbox') { $(this).attr('checked', false); } $(this).remove('a'); }); // Relabel or rename all the relevant bits $(row).children().children().each(function () { updateElementIndex(this, prefix, formCount) if ($(this).attr('type') == 'text') { $(this).val(''); } if ($(this).attr('type') == 'hidden' && ($(this).attr('name') != 'csrfmiddlewaretoken')) { $(this).val(''); } if ($(this).attr('type') == 'file') { $(this).val(''); } if ($(this).attr('type') == 'checkbox') { $(this).attr('checked', false); } $(this).remove('a'); });

    Read the article

  • SharePoint Error: COMException (0x80004005): Cannot complete this action

    - by ifunky
    Hi, I've created a site basic definition that uses different master and default pages. Everything works quite well except for whenever I create a new site based on the definition I receive the following error when browsing to the new site: [COMException (0x80004005): Cannot complete this action. Please try again.] Please try again.] Microsoft.SharePoint.Library.SPRequestInternalClass.GetFileAndMetaInfo(String bstrUrl, Byte bPageView, Byte bPageMode, Byte bGetBuildDependencySet, String bstrCurrentFolderUrl, Boolean& pbCanCustomizePages, Boolean& pbCanPersonalizeWebParts, Boolean& pbCanAddDeleteWebParts, Boolean& pbGhostedDocument, Boolean& pbDefaultToPersonal, String& pbstrSiteRoot, Guid& pgSiteId, UInt32& pdwVersion, String& pbstrTimeLastModified, String& pbstrContent, Byte& pVerGhostedSetupPath, UInt32& pdwPartCount, Object& pvarMetaData, Object& pvarMultipleMeetingDoclibRootFolders, String& pbstrRedirectUrl, Boolean& pbObjectIsList, Guid& pgListId, UInt32& pdwItemId, Int64& pllListFlags, Boolean& pbAccessDenied, Guid& pgDocId, Byte& piLevel, UInt64& ppermMask, Object& pvarBuildDependencySet, UInt32& pdwNumBuildDependencies, Object& pvarBuildDependencies, String& pbstrFolderUrl, String& pbstrContentTypeOrder) +0 Microsoft.SharePoint.Library.SPRequest.GetFileAndMetaInfo(String bstrUrl, Byte bPageView, Byte bPageMode, Byte bGetBuildDependencySet, String bstrCurrentFolderUrl, Boolean& pbCanCustomizePages, Boolean& pbCanPersonalizeWebParts, Boolean& pbCanAddDeleteWebParts, Boolean& pbGhostedDocument, Boolean& pbDefaultToPersonal, String& pbstrSiteRoot, Guid& pgSiteId, UInt32& pdwVersion, String& pbstrTimeLastModified, String& pbstrContent, Byte& pVerGhostedSetupPath, UInt32& pdwPartCount, Object& pvarMetaData, Object& pvarMultipleMeetingDoclibRootFolders, String& pbstrRedirectUrl, Boolean& pbObjectIsList, Guid& pgListId, UInt32& pdwItemId, Int64& pllListFlags, Boolean& pbAccessDenied, Guid& pgDocId, Byte& piLevel, UInt64& ppermMask, Object& pvarBuildDependencySet, UInt32& pdwNumBuildDependencies, Object& pvarBuildDependencies, String& pbstrFolderUrl, String& pbstrContentTypeOrder) +219 [SPException: Cannot complete this action. Please try again.] I'm able to work around this by checking out the new master page and checking it back in again and after doing so there are no further issues at all. Any ideas to what could cause this? 
Thanks Dan ONET.XML module section: <Modules> <Module Name="CustomMasterPage" List="116" Url="_catalogs/masterpage" RootWebOnly="FALSE"> <File Url="Shoes.master" Type="GhostableInLibrary" IgnoreIfAlreadyExists="TRUE" /> </Module> <Module Name="Default" List="116" Url=""> <File Url="default.aspx" Name="default.aspx" NavBarHome="True" IgnoreIfAlreadyExists="FALSE"> <AllUsersWebPart WebPartZoneID="Left" WebPartOrder="1"> &lt;webParts&gt;&lt;webPart xmlns="http://schemas.microsoft.com/WebPart/v3"&gt;&lt;metaData&gt;&lt;type name="BCM.SharePoint.Shoes.ShoesComponents.FooterLinks, BCM.SharePoint.Shoes.ShoesComponents, Version=1.0.0.0, Culture=neutral, PublicKeyToken=2881713f39360b71" /&gt;&lt;importErrorMessage&gt;Cannot import this Web Part.&lt;/importErrorMessage&gt;&lt;/metaData&gt;&lt;data&gt;&lt;properties&gt;&lt;property name="AllowClose" type="bool"&gt;True&lt;/property&gt;&lt;property name="Width" type="string" /&gt;&lt;property name="MyProperty" type="string"&gt;Hello SharePoint&lt;/property&gt;&lt;property name="AllowMinimize" type="bool"&gt;True&lt;/property&gt;&lt;property name="AllowConnect" type="bool"&gt;True&lt;/property&gt;&lt;property name="ChromeType" type="chrometype"&gt;None&lt;/property&gt;&lt;property name="TitleIconImageUrl" type="string"&gt;/_layouts/images/BCM_SharePoint_Shoes/wp_FooterLinks.gif&lt;/property&gt;&lt;property name="Description" type="string"&gt;FooterLinks Description&lt;/property&gt;&lt;property name="Hidden" type="bool"&gt;False&lt;/property&gt;&lt;property name="TitleUrl" type="string" /&gt;&lt;property name="AllowEdit" type="bool"&gt;True&lt;/property&gt;&lt;property name="Height" type="string" /&gt;&lt;property name="MissingAssembly" type="string"&gt;Cannot import this Web Part.&lt;/property&gt;&lt;property name="HelpUrl" type="string" /&gt;&lt;property name="Title" type="string" /&gt;&lt;property name="CatalogIconImageUrl" type="string"&gt;/_layouts/images/BCM_SharePoint_Shoes/wp_FooterLinks.gif&lt;/property&gt;&lt;property name="Direction" type="direction"&gt;NotSet&lt;/property&gt;&lt;property name="ChromeState" type="chromestate"&gt;Normal&lt;/property&gt;&lt;property name="AllowZoneChange" type="bool"&gt;True&lt;/property&gt;&lt;property name="AllowHide" type="bool"&gt;True&lt;/property&gt;&lt;property name="HelpMode" type="helpmode"&gt;Modeless&lt;/property&gt;&lt;property name="ExportMode" type="exportmode"&gt;All&lt;/property&gt;&lt;/properties&gt;&lt;/data&gt;&lt;/webPart&gt;&lt;/webParts&gt; </AllUsersWebPart> <AllUsersWebPart WebPartZoneID="Left" WebPartOrder="0"> &lt;webParts&gt;&lt;webPart xmlns="http://schemas.microsoft.com/WebPart/v3"&gt;&lt;metaData&gt;&lt;type name="BCM.SharePoint.Shoes.ShoesComponents.SubFooterLinks, BCM.SharePoint.Shoes.ShoesComponents, Version=1.0.0.0, Culture=neutral, PublicKeyToken=2881713f39360b71" /&gt;&lt;importErrorMessage&gt;Cannot import this Web Part.&lt;/importErrorMessage&gt;&lt;/metaData&gt;&lt;data&gt;&lt;properties&gt;&lt;property name="AllowClose" type="bool"&gt;True&lt;/property&gt;&lt;property name="Width" type="string" /&gt;&lt;property name="MyProperty" type="string"&gt;Hello SharePoint&lt;/property&gt;&lt;property name="AllowMinimize" type="bool"&gt;True&lt;/property&gt;&lt;property name="AllowConnect" type="bool"&gt;True&lt;/property&gt;&lt;property name="ChromeType" type="chrometype"&gt;None&lt;/property&gt;&lt;property name="TitleIconImageUrl" type="string"&gt;/_layouts/images/BCM_SharePoint_Shoes/wp_SubFooterLinks.gif&lt;/property&gt;&lt;property name="Description" type="string"&gt;Shoes 
home page links (under the hero image)&lt;/property&gt;&lt;property name="Hidden" type="bool"&gt;False&lt;/property&gt;&lt;property name="TitleUrl" type="string" /&gt;&lt;property name="AllowEdit" type="bool"&gt;True&lt;/property&gt;&lt;property name="Height" type="string" /&gt;&lt;property name="MissingAssembly" type="string"&gt;Cannot import this Web Part.&lt;/property&gt;&lt;property name="HelpUrl" type="string" /&gt;&lt;property name="Title" type="string" /&gt;&lt;property name="CatalogIconImageUrl" type="string"&gt;/_layouts/images/BCM_SharePoint_Shoes/wp_SubFooterLinks.gif&lt;/property&gt;&lt;property name="Direction" type="direction"&gt;NotSet&lt;/property&gt;&lt;property name="ChromeState" type="chromestate"&gt;Normal&lt;/property&gt;&lt;property name="AllowZoneChange" type="bool"&gt;True&lt;/property&gt;&lt;property name="AllowHide" type="bool"&gt;True&lt;/property&gt;&lt;property name="HelpMode" type="helpmode"&gt;Modeless&lt;/property&gt;&lt;property name="ExportMode" type="exportmode"&gt;All&lt;/property&gt;&lt;/properties&gt;&lt;/data&gt;&lt;/webPart&gt;&lt;/webParts&gt; </AllUsersWebPart> <NavBarPage Name="$Resources:core,nav_Home;" ID="1002" Position="Start" /> <NavBarPage Name="$Resources:core,nav_Home;" ID="0" Position="Start" /> </File> </Module> </Modules> LOG OUTPUT: 05/21/2010 12:22:55.11 w3wp.exe (0x1E40) 0x18A4 Windows SharePoint Services General 72nz Medium Videntityinfo::isFreshToken reported failure. 05/21/2010 12:22:55.19 w3wp.exe (0x1E40) 0x18A4 Windows SharePoint Services Fields 88yv Medium Creating default lists 05/21/2010 12:22:55.19 w3wp.exe (0x1E40) 0x18A4 Windows SharePoint Services General 72lp Medium Creating directory Lists 05/21/2010 12:22:55.26 w3wp.exe (0x1E40) 0x18A4 Windows SharePoint Services Fields 88yf Medium Creating list "Master Page Gallery" in web "http://mmm-dev-ll/sites/Shoes/test" at URL "_catalogs/masterpage", (setuppath: "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\12\Template\global\lists\mplib") 05/21/2010 12:22:55.28 w3wp.exe (0x1E40) 0x18A4 Windows SharePoint Services Fields 88y1 Medium No document templates uploaded for list "Master Page Gallery" -- none found for list template "100". 05/21/2010 12:22:55.28 w3wp.exe (0x1E40) 0x18A4 Windows SharePoint Services General 72kc Medium Failed to find generic XML file at "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\12\Template\xml\onet.xml", falling back to global site definition. 05/21/2010 12:22:56.22 w3wp.exe (0x1E40) 0x18A4 Windows SharePoint Services Fields 88yz Medium Creating default modules at URL "http://mmm-dev-ll/sites/Shoes/test" 05/21/2010 12:22:56.22 w3wp.exe (0x1E40) 0x18A4 Windows SharePoint Services General 8e27 Medium Ensuring module folder _catalogs/masterpage 05/21/2010 12:22:56.89 w3wp.exe (0x1E40) 0x18A4 Windows SharePoint Services General 72h7 Medium Applying template "SubSite#1" to web at URL "http://mmm-dev-ll/sites/Shoes/test". 
05/21/2010 12:22:57.09 w3wp.exe (0x1E40) 0x18A4 Windows SharePoint Services Fields 88yy Medium Activating web-scoped features for template "SubSite#1" at URL "http://mmm-dev-ll/sites/Shoes/test" 05/21/2010 12:22:57.12 w3wp.exe (0x1E40) 0x18A4 Windows SharePoint Services General 8l1c Medium Preparing 20 features for activation 05/21/2010 12:22:57.14 w3wp.exe (0x1E40) 0x18A4 Windows SharePoint Services General 8l1d Medium Feature Activation: Batch Activating Features at URL http://mmm-dev-ll/sites/Shoes/test 'AnnouncementsList' (ID: '00bfea71-d1ce-42de-9c63-a44004ce0104'), 'ContactsList' (ID: '00bfea71-7e6d-4186-9ba8-c047ac750105'), 'CustomList' (ID: '00bfea71-de22-43b2-a848-c05709900100'), 'DataSourceLibrary' (ID: '00bfea71-f381-423d-b9d1-da7a54c50110'), 'DiscussionsList' (ID: '00bfea71-6a49-43fa-b535-d15c05500108'), 'DocumentLibrary' (ID: '00bfea71-e717-4e80-aa17-d0c71b360101'), 'EventsList' (ID: '00bfea71-ec85-4903-972d-ebe475780106'), 'GanttTasksList' (ID: '00bfea71-513d-4ca0-96c2-6a47775c0119'), 'GridList' (ID: '00bfea71-3a1d-41d3-a0ee-651d11570120'), 'IssuesList' (ID: '00bfea71-5932-4f9c-ad71-1557e5751100'), 'LinksList' (ID: '00bfea71-2062-426c-90bf-714c59600103'), 'NoCodeWorkflowLibrary' (ID: '00bfe... 05/21/2010 12:22:57.14* w3wp.exe (0x1E40) 0x18A4 Windows SharePoint Services General 8l1d Medium ...a71-f600-43f6-a895-40c0de7b0117'), 'PictureLibrary' (ID: '00bfea71-52d4-45b3-b544-b1c71b620109'), 'SurveysList' (ID: '00bfea71-eb8a-40b1-80c7-506be7590102'), 'TasksList' (ID: '00bfea71-a83e-497e-9ba0-7a5c597d0107'), 'WebPageLibrary' (ID: '00bfea71-c796-4402-9f2f-0eb9a6e71b18'), 'workflowProcessList' (ID: '00bfea71-2d77-4a75-9fca-76516689e21a'), 'WorkflowHistoryList' (ID: '00bfea71-4ea5-48d4-a4ad-305cf7030140'), 'XmlFormLibrary' (ID: '00bfea71-1e1d-4562-b56a-f05371bb0115'), 'TeamCollab' (ID: '00bfea71-4ea5-48d4-a4ad-7ea5c011abe5'), . 05/21/2010 12:22:57.15 w3wp.exe (0x1E40) 0x18A4 Windows SharePoint Services General 8l1f Medium Feature Activation: Batch Activated Features at URL http://mmm-dev-ll/sites/Shoes/test 'AnnouncementsList' (ID: '00bfea71-d1ce-42de-9c63-a44004ce0104'), 'ContactsList' (ID: '00bfea71-7e6d-4186-9ba8-c047ac750105'), 'CustomList' (ID: '00bfea71-de22-43b2-a848-c05709900100'), 'DataSourceLibrary' (ID: '00bfea71-f381-423d-b9d1-da7a54c50110'), 'DiscussionsList' (ID: '00bfea71-6a49-43fa-b535-d15c05500108'), 'DocumentLibrary' (ID: '00bfea71-e717-4e80-aa17-d0c71b360101'), 'EventsList' (ID: '00bfea71-ec85-4903-972d-ebe475780106'), 'GanttTasksList' (ID: '00bfea71-513d-4ca0-96c2-6a47775c0119'), 'GridList' (ID: '00bfea71-3a1d-41d3-a0ee-651d11570120'), 'IssuesList' (ID: '00bfea71-5932-4f9c-ad71-1557e5751100'), 'LinksList' (ID: '00bfea71-2062-426c-90bf-714c59600103'), 'NoCodeWorkflowLibrary' (ID: '00bfea... 05/21/2010 12:22:57.15* w3wp.exe (0x1E40) 0x18A4 Windows SharePoint Services General 8l1f Medium ...71-f600-43f6-a895-40c0de7b0117'), 'PictureLibrary' (ID: '00bfea71-52d4-45b3-b544-b1c71b620109'), 'SurveysList' (ID: '00bfea71-eb8a-40b1-80c7-506be7590102'), 'TasksList' (ID: '00bfea71-a83e-497e-9ba0-7a5c597d0107'), 'WebPageLibrary' (ID: '00bfea71-c796-4402-9f2f-0eb9a6e71b18'), 'workflowProcessList' (ID: '00bfea71-2d77-4a75-9fca-76516689e21a'), 'WorkflowHistoryList' (ID: '00bfea71-4ea5-48d4-a4ad-305cf7030140'), 'XmlFormLibrary' (ID: '00bfea71-1e1d-4562-b56a-f05371bb0115'), 'TeamCollab' (ID: '00bfea71-4ea5-48d4-a4ad-7ea5c011abe5'), . 
05/21/2010 12:22:57.15 w3wp.exe (0x1E40) 0x18A4 Windows SharePoint Services General 88jb Medium Feature Activation: Activating Feature 'RadEditorFeatureRichText' (ID: '747755cd-d060-4663-961c-9b0cc43724e9') at URL http://mmm-dev-ll/sites/Shoes/test. 05/21/2010 12:22:57.15 w3wp.exe (0x1E40) 0x18A4 Windows SharePoint Services General 75fb Medium Calling 'FeatureActivated' method of SPFeatureReceiver for Feature 'RadEditorFeatureRichText' (ID: '747755cd-d060-4663-961c-9b0cc43724e9'). 05/21/2010 12:22:57.20 w3wp.exe (0x1E40) 0x18A4 Windows SharePoint Services General 75f8 Medium Feature Activation: Feature 'RadEditorFeatureRichText' (ID: '747755cd-d060-4663-961c-9b0cc43724e9') was activated at URL http://mmm-dev-ll/sites/Shoes/test. 05/21/2010 12:22:57.51 w3wp.exe (0x1E40) 0x18A4 Windows SharePoint Services Fields 88yv Medium Creating default lists 05/21/2010 12:22:57.51 w3wp.exe (0x1E40) 0x18A4 Windows SharePoint Services General 72lp Medium Creating directory Lists 05/21/2010 12:22:57.51 w3wp.exe (0x1E40) 0x18A4 Windows SharePoint Services Fields 88yz Medium Creating default modules at URL "http://mmm-dev-ll/sites/Shoes/test" 05/21/2010 12:22:57.51 w3wp.exe (0x1E40) 0x18A4 Windows SharePoint Services General 8e27 Medium Ensuring module folder _catalogs/masterpage 05/21/2010 12:22:57.56 w3wp.exe (0x1E40) 0x18A4 Windows SharePoint Services General 72ix Medium Not enough information to determine a list for module "Default". Assuming no list for this module. 05/21/2010 12:22:57.87 w3wp.exe (0x1E40) 0x18A4 Windows SharePoint Services General 72h8 Medium Successfully applied template "SubSite#1" to web at URL "http://mmm-dev-ll/sites/Shoes/test". 05/21/2010 12:22:59.48 w3wp.exe (0x1E40) 0x0980 Windows SharePoint Services General 8e2s Medium Unknown SPRequest error occurred. More information: 0x80070057 05/21/2010 12:23:07.06 OWSTIMER.EXE (0x0884) 0x106C Office Server Setup and Upgrade 8u3j High Registry key value {SearchThrottled} was not found under registry hive {Software\Microsoft\Office Server\12.0}. Assuming search sku is not throttled. 05/21/2010 12:23:07.08 OWSTIMER.EXE (0x0884) 0x106C Search Server Common MS Search Administration 90gf Medium SQL: dbo.proc_MSS_PropagationGetQueryServers 05/21/2010 12:23:07.09 OWSTIMER.EXE (0x0884) 0x106C Search Server Common MS Search Administration 8wni High Resuming default catalog with reason 'GPR_PROPAGATION' for application 'SharedServices1'... 05/21/2010 12:23:07.11 OWSTIMER.EXE (0x0884) 0x106C Search Server Common MS Search Administration 8wnj High Resuming anchor text catalog with reason GPR_PROPAGATION' for application 'SharedServices1'... 05/21/2010 12:23:07.14 OWSTIMER.EXE (0x0884) 0x106C Search Server Common MS Search Administration 8dvl Medium Search application '3c6751cc-37b0-470a-bfa2-bfd0b5635fe1': Provision start addresses in default content source. 
05/21/2010 12:23:07.15 OWSTIMER.EXE (0x0884) 0x106C Search Server Common MS Search Administration 7hmh High exception in SearchUpgradeProvisioner Keyword Config System.InvalidOperationException: jobServerSearchServiceInstance is null at Microsoft.Office.Server.Search.Administration.SearchUpgradeProvisioner..ctor(SearchServiceInstance searchServiceInstance) at Microsoft.Office.Server.Search.Administration.OSSPrimaryGathererProject.ProvisionContentSources() 05/21/2010 12:23:29.19 OWSTIMER.EXE (0x0884) 0x0FFC SharePoint Portal Server Business Data 79bv High Initiating BDC Cache Invalidation Check in AppDomain 'DefaultDomain' 05/21/2010 12:23:29.19 OWSTIMER.EXE (0x0884) 0x0FFC SharePoint Portal Server Business Data 79bx High Completed BDC Cache Invalidation Check in AppDomain 'DefaultDomain' 05/21/2010 12:23:45.84 OWSTIMER.EXE (0x0884) 0x08A8 SharePoint Portal Server SSO 8inc Medium In SSOService::Synch(), sso database conn string: 05/21/2010 12:23:50.80 OWSTIMER.EXE (0x0884) 0x0F14 Excel Services Excel Services Administration 8tqi Medium ExcelServerSharedWebApplication.Synchronize: Starting synchronize for instance of Excel Services in SSP 'SharedServices1'. 05/21/2010 12:23:50.80 OWSTIMER.EXE (0x0884) 0x0F14 Excel Services Excel Services Administration 8tqj Medium ExcelServerSharedWebApplication.Synchronize: Successfully synchronized instance of Excel Services in SSP 'SharedServices1'. 05/21/2010 12:23:52.31 w3wp.exe (0x1E40) 0x0980 SharePoint Portal Server Runtime 8gp7 Medium Topology cache updated. (AppDomain: /LM/W3SVC/1963195510/Root-1-129188762904047141)

    Read the article

  • Simple Self Join Query Bad Performance

    - by user1514042
    Could anyone advice on how do I improve the performance of the following query. Note, the problem seems to be caused by where clause. Data (table contains a huge set of rows - 500K+, the set of parameters it's called with assums the return of 2-5K records per query, which takes 8-10 minutes currently): USE [SomeDb] GO SET ANSI_NULLS ON GO SET QUOTED_IDENTIFIER ON GO CREATE TABLE [dbo].[Data]( [x] [money] NOT NULL, [y] [money] NOT NULL, CONSTRAINT [PK_Data] PRIMARY KEY CLUSTERED ( [x] ASC )WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY] ) ON [PRIMARY] GO The Query select top 10000 s.x as sx, e.x as ex, s.y as sy, e.y as ey, e.y - s.y as y_delta, e.x - s.x as x_delta from Data s inner join Data e on e.x > s.x and e.x - s.x between xFrom and xTo --where e.y - s.y > @yDelta -- when uncommented causes a huge delay Update 1 - Execution Plan <?xml version="1.0" encoding="utf-16"?> <ShowPlanXML xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema" Version="1.2" Build="11.0.2100.60" xmlns="http://schemas.microsoft.com/sqlserver/2004/07/showplan"> <BatchSequence> <Batch> <Statements> <StmtSimple StatementCompId="1" StatementEstRows="100" StatementId="1" StatementOptmLevel="FULL" StatementOptmEarlyAbortReason="GoodEnoughPlanFound" StatementSubTreeCost="0.0263655" StatementText="select top 100&#xD;&#xA;s.x as sx,&#xD;&#xA;e.x as ex,&#xD;&#xA;s.y as sy,&#xD;&#xA;e.y as ey,&#xD;&#xA;e.y - s.y as y_delta,&#xD;&#xA;e.x - s.x as x_delta&#xD;&#xA;from Data s &#xD;&#xA; inner join Data e&#xD;&#xA; on e.x &gt; s.x and e.x - s.x between 100 and 105&#xD;&#xA;where e.y - s.y &gt; 0.01&#xD;&#xA;" StatementType="SELECT" QueryHash="0xAAAC02AC2D78CB56" QueryPlanHash="0x747994153CB2D637" RetrievedFromCache="true"> <StatementSetOptions ANSI_NULLS="true" ANSI_PADDING="true" ANSI_WARNINGS="true" ARITHABORT="true" CONCAT_NULL_YIELDS_NULL="true" NUMERIC_ROUNDABORT="false" QUOTED_IDENTIFIER="true" /> <QueryPlan DegreeOfParallelism="0" NonParallelPlanReason="NoParallelPlansInDesktopOrExpressEdition" CachedPlanSize="24" CompileTime="13" CompileCPU="13" CompileMemory="424"> <MemoryGrantInfo SerialRequiredMemory="0" SerialDesiredMemory="0" /> <OptimizerHardwareDependentProperties EstimatedAvailableMemoryGrant="52199" EstimatedPagesCached="14561" EstimatedAvailableDegreeOfParallelism="4" /> <RelOp AvgRowSize="55" EstimateCPU="1E-05" EstimateIO="0" EstimateRebinds="0" EstimateRewinds="0" EstimatedExecutionMode="Row" EstimateRows="100" LogicalOp="Compute Scalar" NodeId="0" Parallel="false" PhysicalOp="Compute Scalar" EstimatedTotalSubtreeCost="0.0263655"> <OutputList> <ColumnReference Database="[SomeDb]" Schema="[dbo]" Table="[Data]" Alias="[s]" Column="x" /> <ColumnReference Database="[SomeDb]" Schema="[dbo]" Table="[Data]" Alias="[s]" Column="y" /> <ColumnReference Database="[SomeDb]" Schema="[dbo]" Table="[Data]" Alias="[e]" Column="x" /> <ColumnReference Database="[SomeDb]" Schema="[dbo]" Table="[Data]" Alias="[e]" Column="y" /> <ColumnReference Column="Expr1004" /> <ColumnReference Column="Expr1005" /> </OutputList> <ComputeScalar> <DefinedValues> <DefinedValue> <ColumnReference Column="Expr1004" /> <ScalarOperator ScalarString="[SomeDb].[dbo].[Data].[y] as [e].[y]-[SomeDb].[dbo].[Data].[y] as [s].[y]"> <Arithmetic Operation="SUB"> <ScalarOperator> <Identifier> <ColumnReference Database="[SomeDb]" Schema="[dbo]" Table="[Data]" Alias="[e]" Column="y" /> </Identifier> </ScalarOperator> 
<ScalarOperator> <Identifier> <ColumnReference Database="[SomeDb]" Schema="[dbo]" Table="[Data]" Alias="[s]" Column="y" /> </Identifier> </ScalarOperator> </Arithmetic> </ScalarOperator> </DefinedValue> <DefinedValue> <ColumnReference Column="Expr1005" /> <ScalarOperator ScalarString="[SomeDb].[dbo].[Data].[x] as [e].[x]-[SomeDb].[dbo].[Data].[x] as [s].[x]"> <Arithmetic Operation="SUB"> <ScalarOperator> <Identifier> <ColumnReference Database="[SomeDb]" Schema="[dbo]" Table="[Data]" Alias="[e]" Column="x" /> </Identifier> </ScalarOperator> <ScalarOperator> <Identifier> <ColumnReference Database="[SomeDb]" Schema="[dbo]" Table="[Data]" Alias="[s]" Column="x" /> </Identifier> </ScalarOperator> </Arithmetic> </ScalarOperator> </DefinedValue> </DefinedValues> <RelOp AvgRowSize="39" EstimateCPU="1E-05" EstimateIO="0" EstimateRebinds="0" EstimateRewinds="0" EstimatedExecutionMode="Row" EstimateRows="100" LogicalOp="Top" NodeId="1" Parallel="false" PhysicalOp="Top" EstimatedTotalSubtreeCost="0.0263555"> <OutputList> <ColumnReference Database="[SomeDb]" Schema="[dbo]" Table="[Data]" Alias="[s]" Column="x" /> <ColumnReference Database="[SomeDb]" Schema="[dbo]" Table="[Data]" Alias="[s]" Column="y" /> <ColumnReference Database="[SomeDb]" Schema="[dbo]" Table="[Data]" Alias="[e]" Column="x" /> <ColumnReference Database="[SomeDb]" Schema="[dbo]" Table="[Data]" Alias="[e]" Column="y" /> </OutputList> <RunTimeInformation> <RunTimeCountersPerThread Thread="0" ActualRows="100" ActualEndOfScans="1" ActualExecutions="1" /> </RunTimeInformation> <Top RowCount="false" IsPercent="false" WithTies="false"> <TopExpression> <ScalarOperator ScalarString="(100)"> <Const ConstValue="(100)" /> </ScalarOperator> </TopExpression> <RelOp AvgRowSize="39" EstimateCPU="151828" EstimateIO="0" EstimateRebinds="0" EstimateRewinds="0" EstimatedExecutionMode="Row" EstimateRows="100" LogicalOp="Inner Join" NodeId="2" Parallel="false" PhysicalOp="Nested Loops" EstimatedTotalSubtreeCost="0.0263455"> <OutputList> <ColumnReference Database="[SomeDb]" Schema="[dbo]" Table="[Data]" Alias="[s]" Column="x" /> <ColumnReference Database="[SomeDb]" Schema="[dbo]" Table="[Data]" Alias="[s]" Column="y" /> <ColumnReference Database="[SomeDb]" Schema="[dbo]" Table="[Data]" Alias="[e]" Column="x" /> <ColumnReference Database="[SomeDb]" Schema="[dbo]" Table="[Data]" Alias="[e]" Column="y" /> </OutputList> <RunTimeInformation> <RunTimeCountersPerThread Thread="0" ActualRows="100" ActualEndOfScans="0" ActualExecutions="1" /> </RunTimeInformation> <NestedLoops Optimized="false"> <OuterReferences> <ColumnReference Database="[SomeDb]" Schema="[dbo]" Table="[Data]" Alias="[e]" Column="x" /> <ColumnReference Database="[SomeDb]" Schema="[dbo]" Table="[Data]" Alias="[e]" Column="y" /> </OuterReferences> <RelOp AvgRowSize="23" EstimateCPU="1.80448" EstimateIO="3.76461" EstimateRebinds="0" EstimateRewinds="0" EstimatedExecutionMode="Row" EstimateRows="1" LogicalOp="Clustered Index Scan" NodeId="3" Parallel="false" PhysicalOp="Clustered Index Scan" EstimatedTotalSubtreeCost="0.0032831" TableCardinality="1640290"> <OutputList> <ColumnReference Database="[SomeDb]" Schema="[dbo]" Table="[Data]" Alias="[e]" Column="x" /> <ColumnReference Database="[SomeDb]" Schema="[dbo]" Table="[Data]" Alias="[e]" Column="y" /> </OutputList> <RunTimeInformation> <RunTimeCountersPerThread Thread="0" ActualRows="15225" ActualEndOfScans="0" ActualExecutions="1" /> </RunTimeInformation> <IndexScan Ordered="false" ForcedIndex="false" ForceScan="false" NoExpandHint="false"> 
<DefinedValues> <DefinedValue> <ColumnReference Database="[SomeDb]" Schema="[dbo]" Table="[Data]" Alias="[e]" Column="x" /> </DefinedValue> <DefinedValue> <ColumnReference Database="[SomeDb]" Schema="[dbo]" Table="[Data]" Alias="[e]" Column="y" /> </DefinedValue> </DefinedValues> <Object Database="[SomeDb]" Schema="[dbo]" Table="[Data]" Index="[PK_Data]" Alias="[e]" IndexKind="Clustered" /> </IndexScan> </RelOp> <RelOp AvgRowSize="23" EstimateCPU="0.902317" EstimateIO="1.88387" EstimateRebinds="1" EstimateRewinds="0" EstimatedExecutionMode="Row" EstimateRows="100" LogicalOp="Clustered Index Seek" NodeId="4" Parallel="false" PhysicalOp="Clustered Index Seek" EstimatedTotalSubtreeCost="0.0263655" TableCardinality="1640290"> <OutputList> <ColumnReference Database="[SomeDb]" Schema="[dbo]" Table="[Data]" Alias="[s]" Column="x" /> <ColumnReference Database="[SomeDb]" Schema="[dbo]" Table="[Data]" Alias="[s]" Column="y" /> </OutputList> <RunTimeInformation> <RunTimeCountersPerThread Thread="0" ActualRows="100" ActualEndOfScans="15224" ActualExecutions="15225" /> </RunTimeInformation> <IndexScan Ordered="true" ScanDirection="FORWARD" ForcedIndex="false" ForceSeek="false" ForceScan="false" NoExpandHint="false" Storage="RowStore"> <DefinedValues> <DefinedValue> <ColumnReference Database="[SomeDb]" Schema="[dbo]" Table="[Data]" Alias="[s]" Column="x" /> </DefinedValue> <DefinedValue> <ColumnReference Database="[SomeDb]" Schema="[dbo]" Table="[Data]" Alias="[s]" Column="y" /> </DefinedValue> </DefinedValues> <Object Database="[SomeDb]" Schema="[dbo]" Table="[Data]" Index="[PK_Data]" Alias="[s]" IndexKind="Clustered" /> <SeekPredicates> <SeekPredicateNew> <SeekKeys> <EndRange ScanType="LT"> <RangeColumns> <ColumnReference Database="[SomeDb]" Schema="[dbo]" Table="[Data]" Alias="[s]" Column="x" /> </RangeColumns> <RangeExpressions> <ScalarOperator ScalarString="[SomeDb].[dbo].[Data].[x] as [e].[x]"> <Identifier> <ColumnReference Database="[SomeDb]" Schema="[dbo]" Table="[Data]" Alias="[e]" Column="x" /> </Identifier> </ScalarOperator> </RangeExpressions> </EndRange> </SeekKeys> </SeekPredicateNew> </SeekPredicates> <Predicate> <ScalarOperator ScalarString="([SomeDb].[dbo].[Data].[x] as [e].[x]-[SomeDb].[dbo].[Data].[x] as [s].[x])&gt;=($100.0000) AND ([SomeDb].[dbo].[Data].[x] as [e].[x]-[SomeDb].[dbo].[Data].[x] as [s].[x])&lt;=($105.0000) AND ([SomeDb].[dbo].[Data].[y] as [e].[y]-[SomeDb].[dbo].[Data].[y] as [s].[y])&gt;(0.01)"> <Logical Operation="AND"> <ScalarOperator> <Compare CompareOp="GE"> <ScalarOperator> <Arithmetic Operation="SUB"> <ScalarOperator> <Identifier> <ColumnReference Database="[SomeDb]" Schema="[dbo]" Table="[Data]" Alias="[e]" Column="x" /> </Identifier> </ScalarOperator> <ScalarOperator> <Identifier> <ColumnReference Database="[SomeDb]" Schema="[dbo]" Table="[Data]" Alias="[s]" Column="x" /> </Identifier> </ScalarOperator> </Arithmetic> </ScalarOperator> <ScalarOperator> <Const ConstValue="($100.0000)" /> </ScalarOperator> </Compare> </ScalarOperator> <ScalarOperator> <Compare CompareOp="LE"> <ScalarOperator> <Arithmetic Operation="SUB"> <ScalarOperator> <Identifier> <ColumnReference Database="[SomeDb]" Schema="[dbo]" Table="[Data]" Alias="[e]" Column="x" /> </Identifier> </ScalarOperator> <ScalarOperator> <Identifier> <ColumnReference Database="[SomeDb]" Schema="[dbo]" Table="[Data]" Alias="[s]" Column="x" /> </Identifier> </ScalarOperator> </Arithmetic> </ScalarOperator> <ScalarOperator> <Const ConstValue="($105.0000)" /> </ScalarOperator> </Compare> </ScalarOperator> 
<ScalarOperator> <Compare CompareOp="GT"> <ScalarOperator> <Arithmetic Operation="SUB"> <ScalarOperator> <Identifier> <ColumnReference Database="[SomeDb]" Schema="[dbo]" Table="[Data]" Alias="[e]" Column="y" /> </Identifier> </ScalarOperator> <ScalarOperator> <Identifier> <ColumnReference Database="[SomeDb]" Schema="[dbo]" Table="[Data]" Alias="[s]" Column="y" /> </Identifier> </ScalarOperator> </Arithmetic> </ScalarOperator> <ScalarOperator> <Const ConstValue="(0.01)" /> </ScalarOperator> </Compare> </ScalarOperator> </Logical> </ScalarOperator> </Predicate> </IndexScan> </RelOp> </NestedLoops> </RelOp> </Top> </RelOp> </ComputeScalar> </RelOp> </QueryPlan> </StmtSimple> </Statements> </Batch> </BatchSequence> </ShowPlanXML>
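    Not part of the original question — just one direction worth testing, sketched here with @xFrom, @xTo and @yDelta standing in for the poster's parameters. Writing the x-range bounds directly against e.x keeps the join predicate sargable, so the clustered index on x can be seeked over a bounded range instead of scanned, and the y filter only runs on the rows that survive the narrowed seek:

    SELECT TOP 10000
           s.x AS sx, e.x AS ex,
           s.y AS sy, e.y AS ey,
           e.y - s.y AS y_delta,
           e.x - s.x AS x_delta
    FROM dbo.Data AS s
    INNER JOIN dbo.Data AS e
        ON  e.x >  s.x
        AND e.x >= s.x + @xFrom    -- same condition as "e.x - s.x BETWEEN @xFrom AND @xTo",
        AND e.x <= s.x + @xTo      -- but expressed as a seekable range on e.x
    WHERE e.y - s.y > @yDelta;     -- residual filter applied after the range seek

    Because the clustered primary key is on x and y is carried in the same row, no extra index is strictly required for this shape; whether the rewrite alone closes the 8–10 minute gap depends on how selective the x range really is.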

    Read the article

  • Week in Geek: LastPass Rescues Xmarks Edition

    - by Asian Angel
    This week we learned how to breathe new life into an aging Windows Mobile 6.x device, use filters in Photoshop, backup and move VirtualBox machines, use the BitDefender Rescue CD to clean an infected PC, and had fun setting up a pirates theme on our computers. Photo by _nash. Weekly Feature Do you love using the Faenza icon set on your Ubuntu system but feel that there are a few much needed icons missing (or you desire a different version of a particular icon)? Then you may want to take a look at the Faenza Variants icon pack. The icons are available in the following sizes: 16px, 22px, 32px, 48px and scalable sizes. Photo by Asian Angel. Faenza Variants Random Geek Links Another week with extra link goodness to help keep you on top of the news. Photo by Asian Angel. LastPass acquires Xmarks, premium service announced Xmarks announced that it has been acquired by LastPass, a cross-platform password management service. This also means that Xmarks is now in transition from a “free” to a “freemium” business model. WikiLeaks reappears on European Net domains WikiLeaks has re-emerged on a Swiss Internet domain followed by domains in Germany, Finland, and the Netherlands, sidestepping a move that had in effect taken the controversial site off the Internet. Iran: Yes, Stuxnet hurt our nuclear program The Stuxnet worm got some big play from Iranian President Mahmoud Ahmadinejad, who acknowledged that the malware dinged his nuclear program. More Windows Rogues than Just AV – Fake Defragmenter Check Disk Don’t think for a second that rogues are limited to scareware, because as so-called products such as “System Defragmenter”, “Scan Disk” “Check Disk” prove, they’re not. Internet Explorer’s Protected Mode can be bypassed Researchers from Verizon Business have now described a way of bypassing Protected Mode in IE 7 and 8 in order to gain access to user accounts. Can you really see who viewed your Facebook profile? Rogue application spreads virally Once again, a rogue application is spreading virally between Facebook users pretending to offer you a way of seeing who has viewed your profile. More holes in Palm’s WebOS Researchers Orlando Barrera and Daniel Herrera, who both work for security firm SecTheory, have discovered a gaping security hole in Palm’s WebOS smartphone operating system. Next-gen banking Trojans hit APAC With the proliferation of banking Trojans, Web and smartphone users of online banking services have to be on constant alert to avoid falling prey to fraud schemes, warned Etay Maor, project manager for RSA Fraud Action. AVG update cripples 64-bit computers A signature update automatically deployed by the AVG virus scanner Thursday has crippled numerous computers. Article includes link to forums to fix computers affected after a restart. Congress moves to outlaw ‘mystery charges’ for Web shoppers Legislation that makes it illegal for Web merchants and so-called post-transaction marketers to charge credit cards without the card owners’ say-so came closer to becoming law this week. Ballmer Set to “Look Into” Windows Home Server Drive Extender Fiasco Tuesday’s announcement from Microsoft regarding the removal of Drive Extender from Windows Home Server has sent shock waves across the web. Google tweaks search recipe to ding scam artists Google has changed its search algorithm to penalize sites deemed to provide an “extremely poor user experience” following a New York Times story on a merchant who justified abusive behavior towards customers as a search-engine optimization tactic. 
Geek Video of the Week Watch as our two friends debate back and forth about the early adoption of new technology through multiple time periods (Stone Age to the far future). Will our reluctant friend finally succumb to the temptation? Photo by CollegeHumor. Early Adopters Through History Random TinyHacker Links Fix Issues in Windows 7 Using Reliability Monitor Learn how to analyze Windows 7 errors and then fix them using the built-in reliability monitor. Learn About IE Tab Groups Tab groups is a useful feature in IE 8. Here’s a detailed guide to what it is all about. Google’s Book Helps You Learn About Browsers and Web A cool new online book by the Google Chrome team on browsers and the web. TrustPort Internet Security 2011 – Good Security from a Less Known Provider TrustPort is not exactly a well-known provider of security solutions. At least not in the consumer space. This review tests in detail their latest offering. How the World is Using Cell phones An infographic showing the shocking demographics of cell phone use. Super User Questions See the great answers to these questions from Super User. I am unable to access my C drive. It says it is unable to display current owner. List of Windows special directories/shortcuts like ‘%TEMP%’ Is using multiple passes for wiping a disk really necessary? How can I view two files side by side in Notepad++ Is there any tool that automatically puts screenshots to my Dropbox? How-To Geek Weekly Article Recap Look through our hottest articles from this past week at How-To Geek. How to Create a Software RAID Array in Windows 7 9 Alternatives for Windows Home Server’s Drive Extender Why Doesn’t Disk Cleanup Delete Everything from the Temp Folder? Ask the Readers: How Much Do You Customize Your Operating System? How to Upload Really Large Files to SkyDrive, Dropbox, or Email One Year Ago on How-To Geek Enjoy reading through these awesome articles from one year ago. How To Upgrade from Vista to Windows 7 Home Premium Edition How To Fix No Aero Transparency in Windows 7 Troubleshoot Startup Problems with Startup Repair Tool in Windows 7 & Vista Rename the Guest Account in Windows 7 for Enhanced Security Disable Error Reporting in XP, Vista, and Windows 7 The Geek Note That wraps things up here for this week. Regardless of the weather wherever you may be, we hope that you have an opportunity to get outside and have some fun! Remember to keep sending those great tips in to us at [email protected]. Photo by Tony the Misfit. Latest Features How-To Geek ETC The How-To Geek Guide to Learning Photoshop, Part 8: Filters Get the Complete Android Guide eBook for Only 99 Cents [Update: Expired] Improve Digital Photography by Calibrating Your Monitor The How-To Geek Guide to Learning Photoshop, Part 7: Design and Typography How to Choose What to Back Up on Your Linux Home Server How To Harmonize Your Dual-Boot Setup for Windows and Ubuntu Hang in There Scrat! – Ice Age Wallpaper How Do You Know When You’ve Passed Geek and Headed to Nerd? On The Tip – A Lamborghini Theme for Chrome and Iron What if Wile E. Coyote and the Road Runner were Human? [Video] Peaceful Winter Cabin Wallpaper Store Tabs for Later Viewing in Opera with Tab Vault

    Read the article

  • Secure wipe of a hard drive using WinPE.

    - by Derek Meier
    The wiping of a hard drive is typically seen as fairly trivial.  There are tons of applications out there that will do it for you.  Point → Click → Global-Thermo Nuclear War. However, these applications are typically expensive or unreliable.  Plus, if you have a laptop or lack a secondary computer to put the hard drive into – how on earth do you wipe it quickly and easily while still conforming to a 7 pass rule (this means that every possible bit on the hard drive is set to 0 and then to 1 seven times in a row)?  Yes, one pass should be enough – as turning every bit from a 1 to a zero will wipe the data from existence.  But, we're dealing with tinfoil hat wearing types here people.  DOD standards dictate at least 3 passes, and typically 7 is the preferred amount.  I'm not going to argue about data recovery.  I have been told to use 7 passes, and so I will.  So say we all!

    Quite some time ago I used to make a BartPE XP-based boot CD for the original purpose of securely wiping data.  I loved BartPE and integrated so many plugins into my builds that I could do pretty much anything directly from CD: reset passwords, uninstall security updates, wipe drives, chkdsk, remove spyware, install Windows, etc.  However, with the newer multi-core systems and new chipsets coming out from vendors, I found that BartPE was rather difficult to keep up to date.  I have since switched to WinPE 3.0 (Windows Preinstallation Environment). http://technet.microsoft.com/en-us/library/cc748933(WS.10).aspx  It is fairly simple to create your own CD, and I have made a few helpful scripts to easily integrate drivers and rebuild the ISO file for you.  I'll cover making your own boot CD utilizing WinPE 3.0 in a later post – I can talk about WinPE forever and need to collect my thoughts!!  My wife loves talking about WinPE almost as much as talking about Doctor Who.  Wait, did I say loves?  Hmmmm, I may have meant loathes.

    The topic at hand?  Right. Wiping a drive!  I must have drunk too much coffee this morning.  I like to use a simple batch script that calls a combination of diskpart.exe from Microsoft® and Sdelete.exe created by our friend Mark Russinovich. http://technet.microsoft.com/en-us/sysinternals/bb897443.aspx  All of the following files are located within the same directory on my WinPE boot CD.  Here are the contents of wipe_me.bat, script.txt and sdelete.reg.

    Wipe_me.bat:

    @echo off
    echo.
    echo     I will completely wipe the local hard drives using
    echo     7 individual wipes. The data will NOT
    echo     be recoverable.  I will begin after you pause
    echo.
    echo Preparing to partition and format disk.
    Diskpart.exe /s "script.txt"
    REM I was annoyed by not having a completely automated script – and Sdelete wants you to accept the license agreement.  So, I added a registry file to skip doing that.
    regedit /S sdelete.reg
    rem sdelete options selected are: -p (passes) -c (zero free space) -s (recurse through subdirectories, if any) -z (clean free space) [drive letter]
    sdelete.exe -p 7 -c -s -z c:
    echo.
    echo Pass seven complete.
    echo.
    echo Wiping complete.
    Pause
    exit

    script.txt:

    list disk
    select disk 0
    clean
    create partition primary
    select partition 1
    active
    format FS=NTFS LABEL="New Volume" QUICK
    assign letter=c
    exit

    *Notes: This script assumes one local hard drive – change the script as you see fit for your environment.  The clean command will overwrite the master boot record and any hidden sector information – so be careful!

    sdelete.reg:

    Windows Registry Editor Version 5.00
    [HKEY_CURRENT_USER\Software\Sysinternals\SDelete]
    "EulaAccepted"=dword:00000001

    With a combination of WinPE, sdelete.exe and your friendly neighborhood text editor you can begin wiping drives as quickly and easily as possible!  I hope this helps, I get asked this a lot in my line of work. Best of luck, Derek

    Read the article

  • Can't run Eclipse after installing ADT Plugin

    - by user89439
    So, I've installed the ADT Plugin, run a HelloWorld, restart my computer and after that the Eclipse can't run. A message appear: "An error has ocurred. See the log file: /home/todi (...)" Here is the log file: !SESSION 2011-07-26 22:51:59.381 ----------------------------------------------- eclipse.buildId=I20110613-1736 java.version=1.6.0_26 java.vendor=Sun Microsystems Inc. BootLoader constants: OS=win32, ARCH=x86, WS=win32, NL=pt_BR Framework arguments: -product org.eclipse.epp.package.java.product Command-line arguments: -os win32 -ws win32 -arch x86 -product org.eclipse.epp.package.java.product !ENTRY org.eclipse.update.configurator 4 0 2011-07-26 22:57:34.135 !MESSAGE Could not rename configuration temp file !ENTRY org.eclipse.update.configurator 4 0 2011-07-26 22:57:34.157 !MESSAGE Unable to save configuration file "C:\Program Files\eclipse\configuration\org.eclipse.update\platform.xml.tmp" !STACK 0 java.io.IOException: Unable to save configuration file "C:\Program Files\eclipse\configuration\org.eclipse.update\platform.xml.tmp" at org.eclipse.update.internal.configurator.PlatformConfiguration.save(PlatformConfiguration.java:690) at org.eclipse.update.internal.configurator.PlatformConfiguration.save(PlatformConfiguration.java:574) at org.eclipse.update.internal.configurator.PlatformConfiguration.startup(PlatformConfiguration.java:714) at org.eclipse.update.internal.configurator.ConfigurationActivator.getPlatformConfiguration(ConfigurationActivator.java:404) at org.eclipse.update.internal.configurator.ConfigurationActivator.initialize(ConfigurationActivator.java:136) at org.eclipse.update.internal.configurator.ConfigurationActivator.start(ConfigurationActivator.java:69) at org.eclipse.osgi.framework.internal.core.BundleContextImpl$1.run(BundleContextImpl.java:711) at java.security.AccessController.doPrivileged(Native Method) at org.eclipse.osgi.framework.internal.core.BundleContextImpl.startActivator(BundleContextImpl.java:702) at org.eclipse.osgi.framework.internal.core.BundleContextImpl.start(BundleContextImpl.java:683) at org.eclipse.osgi.framework.internal.core.BundleHost.startWorker(BundleHost.java:381) at org.eclipse.osgi.framework.internal.core.AbstractBundle.start(AbstractBundle.java:299) at org.eclipse.osgi.framework.util.SecureAction.start(SecureAction.java:440) at org.eclipse.osgi.internal.loader.BundleLoader.setLazyTrigger(BundleLoader.java:268) at org.eclipse.core.runtime.internal.adaptor.EclipseLazyStarter.postFindLocalClass(EclipseLazyStarter.java:107) at org.eclipse.osgi.baseadaptor.loader.ClasspathManager.findLocalClass(ClasspathManager.java:462) at org.eclipse.osgi.internal.baseadaptor.DefaultClassLoader.findLocalClass(DefaultClassLoader.java:216) at org.eclipse.osgi.internal.loader.BundleLoader.findLocalClass(BundleLoader.java:400) at org.eclipse.osgi.internal.loader.BundleLoader.findClassInternal(BundleLoader.java:476) at org.eclipse.osgi.internal.loader.BundleLoader.findClass(BundleLoader.java:429) at org.eclipse.osgi.internal.loader.BundleLoader.findClass(BundleLoader.java:417) at org.eclipse.osgi.internal.baseadaptor.DefaultClassLoader.loadClass(DefaultClassLoader.java:107) at java.lang.ClassLoader.loadClass(Unknown Source) at org.eclipse.osgi.internal.loader.BundleLoader.loadClass(BundleLoader.java:345) at org.eclipse.osgi.framework.internal.core.BundleHost.loadClass(BundleHost.java:229) at org.eclipse.osgi.framework.internal.core.AbstractBundle.loadClass(AbstractBundle.java:1207) at 
org.eclipse.equinox.internal.ds.model.ServiceComponent.createInstance(ServiceComponent.java:480) at org.eclipse.equinox.internal.ds.model.ServiceComponentProp.createInstance(ServiceComponentProp.java:271) at org.eclipse.equinox.internal.ds.model.ServiceComponentProp.build(ServiceComponentProp.java:332) at org.eclipse.equinox.internal.ds.InstanceProcess.buildComponent(InstanceProcess.java:588) at org.eclipse.equinox.internal.ds.ServiceReg.getService(ServiceReg.java:53) at org.eclipse.osgi.internal.serviceregistry.ServiceUse$1.run(ServiceUse.java:138) at java.security.AccessController.doPrivileged(Native Method) at org.eclipse.osgi.internal.serviceregistry.ServiceUse.getService(ServiceUse.java:136) at org.eclipse.osgi.internal.serviceregistry.ServiceRegistrationImpl.getService(ServiceRegistrationImpl.java:468) at org.eclipse.osgi.internal.serviceregistry.ServiceRegistry.getService(ServiceRegistry.java:467) at org.eclipse.osgi.framework.internal.core.BundleContextImpl.getService(BundleContextImpl.java:594) at org.osgi.util.tracker.ServiceTracker.addingService(ServiceTracker.java:450) at org.osgi.util.tracker.ServiceTracker$Tracked.customizerAdding(ServiceTracker.java:980) at org.osgi.util.tracker.ServiceTracker$Tracked.customizerAdding(ServiceTracker.java:1) at org.osgi.util.tracker.AbstractTracked.trackAdding(AbstractTracked.java:262) at org.osgi.util.tracker.AbstractTracked.trackInitial(AbstractTracked.java:185) at org.osgi.util.tracker.ServiceTracker.open(ServiceTracker.java:348) at org.osgi.util.tracker.ServiceTracker.open(ServiceTracker.java:283) at org.eclipse.core.internal.runtime.InternalPlatform.getBundleGroupProviders(InternalPlatform.java:225) at org.eclipse.core.runtime.Platform.getBundleGroupProviders(Platform.java:1261) at org.eclipse.ui.internal.ide.IDEWorkbenchPlugin.getFeatureInfos(IDEWorkbenchPlugin.java:291) at org.eclipse.ui.internal.ide.WorkbenchActionBuilder.makeFeatureDependentActions(WorkbenchActionBuilder.java:1217) at org.eclipse.ui.internal.ide.WorkbenchActionBuilder.makeActions(WorkbenchActionBuilder.java:1026) at org.eclipse.ui.application.ActionBarAdvisor.fillActionBars(ActionBarAdvisor.java:147) at org.eclipse.ui.internal.ide.WorkbenchActionBuilder.fillActionBars(WorkbenchActionBuilder.java:341) at org.eclipse.ui.internal.WorkbenchWindow.fillActionBars(WorkbenchWindow.java:3564) at org.eclipse.ui.internal.WorkbenchWindow.(WorkbenchWindow.java:419) at org.eclipse.ui.internal.tweaklets.Workbench3xImplementation.createWorkbenchWindow(Workbench3xImplementation.java:31) at org.eclipse.ui.internal.Workbench.newWorkbenchWindow(Workbench.java:1920) at org.eclipse.ui.internal.Workbench.access$14(Workbench.java:1918) at org.eclipse.ui.internal.Workbench$68.runWithException(Workbench.java:3658) at org.eclipse.ui.internal.StartupThreading$StartupRunnable.run(StartupThreading.java:31) at org.eclipse.swt.widgets.RunnableLock.run(RunnableLock.java:35) at org.eclipse.swt.widgets.Synchronizer.runAsyncMessages(Synchronizer.java:135) at org.eclipse.swt.widgets.Display.runAsyncMessages(Display.java:4140) at org.eclipse.swt.widgets.Display.readAndDispatch(Display.java:3757) at org.eclipse.ui.application.WorkbenchAdvisor.openWindows(WorkbenchAdvisor.java:803) at org.eclipse.ui.internal.Workbench$33.runWithException(Workbench.java:1595) at org.eclipse.ui.internal.StartupThreading$StartupRunnable.run(StartupThreading.java:31) at org.eclipse.swt.widgets.RunnableLock.run(RunnableLock.java:35) at org.eclipse.swt.widgets.Synchronizer.runAsyncMessages(Synchronizer.java:135) at 
org.eclipse.swt.widgets.Display.runAsyncMessages(Display.java:4140) at org.eclipse.swt.widgets.Display.readAndDispatch(Display.java:3757) at org.eclipse.ui.internal.Workbench.runUI(Workbench.java:2604) at org.eclipse.ui.internal.Workbench.access$4(Workbench.java:2494) at org.eclipse.ui.internal.Workbench$7.run(Workbench.java:674) at org.eclipse.core.databinding.observable.Realm.runWithDefault(Realm.java:332) at org.eclipse.ui.internal.Workbench.createAndRunWorkbench(Workbench.java:667) at org.eclipse.ui.PlatformUI.createAndRunWorkbench(PlatformUI.java:149) at org.eclipse.ui.internal.ide.application.IDEApplication.start(IDEApplication.java:123) at org.eclipse.equinox.internal.app.EclipseAppHandle.run(EclipseAppHandle.java:196) at org.eclipse.core.runtime.internal.adaptor.EclipseAppLauncher.runApplication(EclipseAppLauncher.java:110) at org.eclipse.core.runtime.internal.adaptor.EclipseAppLauncher.start(EclipseAppLauncher.java:79) at org.eclipse.core.runtime.adaptor.EclipseStarter.run(EclipseStarter.java:344) at org.eclipse.core.runtime.adaptor.EclipseStarter.run(EclipseStarter.java:179) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) at java.lang.reflect.Method.invoke(Unknown Source) at org.eclipse.equinox.launcher.Main.invokeFramework(Main.java:622) at org.eclipse.equinox.launcher.Main.basicRun(Main.java:577) at org.eclipse.equinox.launcher.Main.run(Main.java:1410) !ENTRY org.eclipse.equinox.p2.operations 4 0 2011-07-27 00:15:28.049 !MESSAGE Operation details !SUBENTRY 1 org.eclipse.equinox.p2.director 4 1 2011-07-27 00:15:28.049 !MESSAGE Cannot complete the install because some dependencies are not satisfiable !SUBENTRY 2 org.eclipse.equinox.p2.director 4 0 2011-07-27 00:15:28.049 !MESSAGE org.eclipse.linuxtools.callgraph.feature.group [0.0.2.201106060936] cannot be installed in this environment because its filter is not applicable. !ENTRY org.eclipse.equinox.p2.operations 4 0 2011-07-27 00:15:28.644 !MESSAGE Operation details !SUBENTRY 1 org.eclipse.equinox.p2.director 4 1 2011-07-27 00:15:28.644 !MESSAGE Cannot complete the install because some dependencies are not satisfiable !SUBENTRY 2 org.eclipse.equinox.p2.director 4 0 2011-07-27 00:15:28.644 !MESSAGE org.eclipse.linuxtools.callgraph.feature.group [0.0.2.201106060936] cannot be installed in this environment because its filter is not applicable. !ENTRY org.eclipse.equinox.p2.operations 4 0 2011-07-27 00:27:35.152 !MESSAGE Operation details !SUBENTRY 1 org.eclipse.equinox.p2.director 4 1 2011-07-27 00:27:35.158 !MESSAGE Cannot complete the install because some dependencies are not satisfiable !SUBENTRY 2 org.eclipse.equinox.p2.director 4 0 2011-07-27 00:27:35.159 !MESSAGE org.eclipse.linuxtools.callgraph.feature.group [0.0.2.201106060936] cannot be installed in this environment because its filter is not applicable. !ENTRY org.eclipse.equinox.p2.operations 4 0 2011-07-27 00:27:35.215 !MESSAGE Operation details !SUBENTRY 1 org.eclipse.equinox.p2.director 4 1 2011-07-27 00:27:35.216 !MESSAGE Cannot complete the install because some dependencies are not satisfiable !SUBENTRY 2 org.eclipse.equinox.p2.director 4 0 2011-07-27 00:27:35.216 !MESSAGE org.eclipse.linuxtools.callgraph.feature.group [0.0.2.201106060936] cannot be installed in this environment because its filter is not applicable. 
!ENTRY org.eclipse.equinox.p2.operations 4 0 2011-07-27 01:07:17.988 !MESSAGE Operation details !SUBENTRY 1 org.eclipse.equinox.p2.director 4 1 2011-07-27 01:07:18.006 !MESSAGE Cannot complete the install because some dependencies are not satisfiable !SUBENTRY 2 org.eclipse.equinox.p2.director 4 0 2011-07-27 01:07:18.006 !MESSAGE org.eclipse.linuxtools.callgraph.feature.group [0.0.2.201106060936] cannot be installed in this environment because its filter is not applicable. !ENTRY org.eclipse.equinox.p2.operations 4 0 2011-07-27 01:07:19.847 !MESSAGE Operation details !SUBENTRY 1 org.eclipse.equinox.p2.director 4 1 2011-07-27 01:07:19.848 !MESSAGE Cannot complete the install because some dependencies are not satisfiable !SUBENTRY 2 org.eclipse.equinox.p2.director 4 0 2011-07-27 01:07:19.848 !MESSAGE org.eclipse.linuxtools.callgraph.feature.group [0.0.2.201106060936] cannot be installed in this environment because its filter is not applicable. I don't understand how the path windows like has appeared... if anyone knows how to solve this, I'll appreciate! Thank you for all your answers! Best regards, Alexandre Ferreira.

    Read the article

  • Solaris 10 branded zone VM Templates for Solaris 11 on OTN

    - by jsavit
    Early this year I wrote the article Ours Goes To 11 which describes the ability to import Solaris 10 systems into a "Solaris 10 branded zone" under Oracle Solaris 11. I did this using Solaris 11 Express, and the capability remains in Solaris 11 with only slight changes. This important tool lets you painlessly inhaling a Solaris Container from Solaris 10 or entire Solaris 10 systems ("the global zone") into virtualized environments on a Solaris 11 OS. Just recently, Oracle provided Oracle VM Templates for Oracle Solaris 10 Zones to let you create Solaris 10 branded zones for Solaris 11 even if you don't currently have access to install media or a running Solaris 10 system. To use this, just download the Oracle VM Template for Oracle Solaris Zone 10 from OTN at http://www.oracle.com/technetwork/server-storage/solaris11/downloads/virtual-machines-1355605.html. This page contains images of Oracle Solaris 10 8/11 (the recent update to Solaris 10) in SPARC and x86 formats suitable for creating branded zones. The same page also has a VirtualBox image you can download for a complete Solaris 10 install in a guest virtual machine you can run on any host OS that supports VirtualBox. Both sets of downloads provide a quick - and extremely easy - way to set up a virtual Solaris 10 environment. In the case of the Oracle VM Templates, they illustrate several advanced features of Solaris 11. To start, just go to the above link, download the template for the hardware platform (SPARC or x86) you want, and download the README file also linked from that page. Install prerequisites The README file tells you to install the prerequisite Solaris 11 package that implements the Solaris 10 brand. Then you can install instances of zones with that brand. # pkg install pkg:/system/zones/brand/brand-solaris10 Packages to install: 1 Create boot environment: No Create backup boot environment: Yes DOWNLOAD PKGS FILES XFER (MB) Completed 1/1 44/44 0.4/0.4 PHASE ACTIONS Install Phase 74/74 PHASE ITEMS Package State Update Phase 1/1 Image State Update Phase 2/2 That took only a few minutes, and didn't require a reboot. Install the Solaris 10 zone Now it's time to run the downloaded template file. First make it executable via the chmod command, of course. I found that (unlike stated in the README) there was no need to rename the downloaded file to remove the .bin. When you run it you provide several parameters to describe the zone configuration: -a IP address - the IP address and optional netmask for the zone. This is the only mandatory parameter. -z zonename - the name of the zone you would like to create. -i interface - the package will create an exclusive-IP zone using a virtual NIC (vnic) based on this physical interface. In my case, I have a NIC called rge0. -p PATH - specifies the path in which you want the zoneroot to be placed. In my case, I have a ZFS dataset mounted at /zones, and this will create a zoneroot at /zones/s10u10. Kicking it off, you will see a copyright message, and then messages showing progress building the zone, which only takes a few minutes. # ./solaris-10u10-x86.bin -p /zones -a 192.168.1.100 -i rge0 -z s10u10 ... ... Checking disk-space for extraction Ok Extracting in /export/home/CDimages/s10zone/bootimage.ihaqvh ... 100% [===============================] Checking data integrity Ok Checking platform compatibility The host and the image do not have the same Solaris release: host Solaris release: 5.11 image Solaris release: 5.10 Will create a Solaris 10 branded zone. 
Warning: could not find a defaultrouter Zone won't have any defaultrouter configured IMAGE: ./solaris-10u10-x86.bin ZONE: s10u10 ZONEPATH: /zones/s10u10 INTERFACE: rge0 VNIC: vnicZBI13379 MAC ADDR: 2:8:20:5c:1a:cc IP ADDR: 192.168.1.100 NETMASK: 255.255.255.0 DEFROUTER: NONE TIMEZONE: US/Arizona Checking disk-space for installation Ok Installing in /zones/s10u10 ... 100% [===============================] Using a static exclusive-IP Attaching s10u10 Booting s10u10 Waiting for boot to complete booting... booting... booting... Zone s10u10 booted The zone's root password has been set using the root password of the local host. You can change the zone's root password to further harden the security of the zone: being root, log into the zone from the local host with the command 'zlogin s10u10'. Once logged in, change the root password with the command 'passwd'. The nifty part in my opinion (besides being so easy), is that the zone was created as an exclusive-IP zone on a virtual NIC. This network configuration lets you enforce traffic isolation from other zones, enforce network Quality of Service, and even let the zone set its own characteristics like IP address and packet size. Independence of the zone's network characteristics from the global zone is one of the enhancements in Solaris 10 that make it easier to consolidate zones while preserving their autonomy, yet provide control in a consolidated environment. Let's see what the virtual network environment looks like by issuing commands from the Solaris 11 global zone. First I'll use Old School ifconfig, and then I'll use the new ipadm and dladm commands. # ifconfig -a4 lo0: flags=2001000849<UP,LOOPBACK,RUNNING,MULTICAST,IPv4,VIRTUAL> mtu 8232 index 1 inet 127.0.0.1 netmask ff000000 rge0: flags=1004943<UP,BROADCAST,RUNNING,PROMISC,MULTICAST,DHCP,IPv4> mtu 1500 index 2 inet 192.168.1.3 netmask ffffff00 broadcast 192.168.1.255 ether 0:14:d1:18:ac:bc vboxnet0: flags=201000843<UP,BROADCAST,RUNNING,MULTICAST,IPv4,CoS> mtu 1500 index 3 inet 192.168.56.1 netmask ffffff00 broadcast 192.168.56.255 ether 8:0:27:f8:62:1c # dladm show-phys LINK MEDIA STATE SPEED DUPLEX DEVICE yge0 Ethernet unknown 0 unknown yge0 yge1 Ethernet unknown 0 unknown yge1 rge0 Ethernet up 1000 full rge0 vboxnet0 Ethernet up 1000 full vboxnet0 # dladm show-link LINK CLASS MTU STATE OVER yge0 phys 1500 unknown -- yge1 phys 1500 unknown -- rge0 phys 1500 up -- vboxnet0 phys 1500 up -- vnicZBI13379 vnic 1500 up rge0 s10u10/vnicZBI13379 vnic 1500 up rge0 s10u10/net0 vnic 1500 up rge0 # dladm show-vnic LINK OVER SPEED MACADDRESS MACADDRTYPE VID vnicZBI13379 rge0 1000 2:8:20:5c:1a:cc random 0 s10u10/vnicZBI13379 rge0 1000 2:8:20:5c:1a:cc random 0 s10u10/net0 rge0 1000 2:8:20:9d:d0:79 random 0 # ipadm show-addr ADDROBJ TYPE STATE ADDR lo0/v4 static ok 127.0.0.1/8 rge0/_a dhcp ok 192.168.1.3/24 vboxnet0/_a static ok 192.168.56.1/24 lo0/v6 static ok ::1/128 Log into the zone The install step already booted the zone, so lets log into it. Notice how you have to be appropriately privileged to log into a zone. This is my home system so I'm being a bit cavalier, but in a production environment you can give granular control of who can login to which zones. Voila! a Solaris 10 environment under a Solaris 11 kernel. Notice the output from the uname -a and ifconfig commands, and output from a ping to a nearby host. 
$ zlogin s10u10 zlogin: You lack sufficient privilege to run this command (all privs required) savit@home:~$ sudo zlogin s10u10 Password: [Connected to zone 's10u10' pts/5] Oracle Corporation SunOS 5.10 Generic Patch January 2005 # uname -a SunOS s10u10 5.10 Generic_Virtual i86pc i386 i86pc # ifconfig -a4 lo0: flags=2001000849 mtu 8232 index 1 inet 127.0.0.1 netmask ff000000 vnicZBI13379: flags=1000843 mtu 1500 index 2 inet 192.168.1.100 netmask ffffff00 broadcast 192.168.1.255 ether 2:8:20:5c:1a:cc # bash bash-3.2# ifconfig -a lo0: flags=2001000849 mtu 8232 index 1 inet 127.0.0.1 netmask ff000000 vnicZBI13379: flags=1000843 mtu 1500 index 2 inet 192.168.1.100 netmask ffffff00 broadcast 192.168.1.255 ether 2:8:20:5c:1a:cc bash-3.2# ping 192.168.1.2 192.168.1.2 is alive For fun, I configured Apache (setting its configuration file in /etc/apache2) and brought it up. Easy - took just a few minutes. bash-3.2# svcs apache2 STATE STIME FMRI disabled 12:38:46 svc:/network/http:apache2 bash-3.2# svcadm enable apache2 Summary In just a few minutes, I built a functioning virtual Solaris 10 environment under by Solaris 11 system. It was... easy! While I can still do it the manual way (creating and using a system archive), this is a low-effort way to create a Solaris 10 zone on Solaris 11.

    Read the article

  • Compress Large Video Files with DivX / Xvid and AutoGK

    - by DigitalGeekery
    Have you ever recorded home video on a camcorder only to find the video size is enormous? What if you wanted to share a video clip on YouTube or another video sharing site, but the file size was bigger than the maximum upload size? Today we’ll look at a way to compress certain video files, such as MPEG and AVI, with Auto Gordian Knot (AutoGK). AutoGK is a free application that runs on Windows. It supports Mpeg1, Mpeg2, Transport Streams, Vobs, and virtually any codec used for an .AVI file. AutoGK will accept as input the following file types: MPG, MPEG, VOB, VRO, M2V, DAT, IFO, TS, TP, TRP, M2T, and AVI. Files are output as .AVI files and are converted using the DivX or XviD codecs. Installing and Using AutoGK Download and install AutoGK (link below) Open the AutoGK. You’ll need to navigate a few wizard screens, but you can just accept the defaults.   Choose your video file by clicking on the folder to the right of the Input file text box.   Browse for and select your video file and click “Open.”   For this example, we’ll be working with an .AVI file that’s 167MB in size.   The output file is copied into the same directory as the input file by default, but you can change this if you choose. If the input file is also .AVI, AutoGK will append an _agk to the output file so that the original is not overwritten. Next, you’ll see any audio tracks listed. You can unselect the check box if you’d like to remove the audio track. You can choose one of the Predefined size options… Or, select a Custom size in MB or Target Quality in percentage. For our example, we’ll be compressing our 167MB file to 35MB. Click on Advanced Settings. Here you can choose your codec, if you have a preference, as well as output resolution and output audio. If you’d like to use the DivX codec, you’ll need to download and install it separately. (See link below) Typically you’ll want to keep the defaults. Click “OK.” Now you’re ready to add your file conversion job to the Job queue. Click Add Job to add it to the queue. You can add multiple files conversions to the job queue and  convert them in one batch. Click Start to begin the conversion process. The process will begin. You’ll be able to see the progress in the Log window on the bottom left. When the conversion is complete you’ll see a “Job finished” and the total time in the log window.   Check your output file to see it’s compressed size. Test your video just to make sure the output quality is satisfactory.   Note:  Conversion times can vary greatly depending on the size of the file and your computer hardware. Files that are several GBs in size may take several hours to compress. AutoGK is no longer being actively developed but is still a wonderful DivX/XviD conversion tool. It can also be used to compress and convert non-copy protected DVDs. 
Downloads AutoGordianKnot DivX (optional)

    Read the article

  • Lessons from a SAN Failure

    - by Bill Graziano
    At 1:10AM Sunday morning the main SAN at one of my clients suffered a “partial” failure.  Partial means that the SAN was still online and functioning but the LUNs attached to our two main SQL Servers “failed”.  Failed means that SQL Server wouldn’t start and the MDF and LDF files mostly showed a zero file size.  But they were online and responding and most other LUNs were available.  I’m not sure how SANs know to fail at 1AM on a Saturday night but they seem to.  From a personal standpoint this worked out poorly: I was out with friends and after more than a few drinks.  From a work standpoint this was about the best time to fail you could imagine.  Everything was running well before Monday morning.  But it was a long, long Sunday.  I started tipsy, got tired and ended up hung over later in the day. Note to self: Try not to go out drinking right before the SAN fails. This caught us at an interesting time.  We’re in the process of migrating to an entirely new set of servers so some things were partially moved.  This made it difficult to follow our procedures as cleanly as we’d like.  The benefit was that we had much better documentation of everything on the server.  I would encourage everyone to really think through the process of implementing your DR plan and document as much as possible.  Following a checklist is much easier than trying to remember at night under pressure in a hurry after a few drinks. I had a series of estimates on how long things would take.  They were accurate for any single server failure.  They weren’t accurate for a SAN failure that took two servers down.  This wasn’t bad but we should have communicated better. Don’t forget how many things are outside the database.  Logins, linked servers, DTS packages (yikes!), jobs, service broker, DTC (especially DTC), database triggers and any objects in the master database are all things you need backed up.  We’d done a decent job on this and didn’t find significant problems here.  That said this still took a lot of time.  There were many annoyances as a result of this.  Small settings like a login’s default database had a big impact on whether an application could run.  This is probably the single biggest area of concern when looking to recreate a server.  I’d encourage everyone to go through every single node of SSMS and look for user created objects or settings outside the database. Script out your logins with the proper SID and already encrypted passwords and keep it updated.  This makes life so much easier.  I used an approach based on KB246133 that worked well.  I’ll get my scripts posted over the next few days. The disaster can cause your DR process to fail in unexpected ways.  We have a job that scripts out all logins and role memberships and writes it to a file.  This runs on the DR server and pulls from the production server.  Upon opening the file I found that the contents were a “server not found” error.  Fortunately we had other copies and didn’t need to try and restore the master database.  This now runs on the production server and pushes the script to the DR site.  Soon we’ll get it pushed to our version control software. One of the biggest challenges is keeping your DR resources up to date.  Any server change (new linked server, new SQL Server Agent job, etc.) means that your DR plan (and scripts) is out of date.  It helps to automate the generation of these resources if possible. Take time now to test your database restore process.  We test ours quarterly.  
If you have a large database I’d also encourage you to invest in a compressed backup solution.  Restoring backups was the single largest consumer of time during our recovery. And yes, there’s a database mirroring solution planned in our new architecture. I didn’t have much involvement in things outside SQL Server but this caused many, many things to change in our environment.  Many applications today aren’t just executables or web sites.  They are a combination of those plus network infrastructure, reports, network ports, IP addresses, DTS and SSIS packages, batch systems and many other things.  These all needed a little bit of attention to make sure they were functioning properly. Profiler turned out to be a handy tool.  I started a trace for failed logins and kept that running.  That let me fix a number of problems before people were able to report them.  I also ran traces to capture exceptions.  This helped identify problems with linked servers. Overall the thing that gave me the most problems was linked servers.  In order for a linked server to function properly you need to be pointed to the right server, have the proper login information, have the network routes available and have MSDTC configured properly.  We have a lot of linked servers and this created many failure points.  Some of the older linked servers used IP addresses and not DNS names.  This meant we had to go in and touch all those linked servers when the servers moved.
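    For reference, here is a minimal sketch of the idea behind the KB246133-style login scripting the article mentions — my illustration, not the author's actual script.  It assumes SQL Server 2008 or later (for the binary-to-string CONVERT style and sys.sql_logins) and generates CREATE LOGIN statements that carry the original SID and the already-hashed password, so SQL logins can be recreated on the DR server without ever knowing the plain-text passwords:

    SELECT 'CREATE LOGIN ' + QUOTENAME(p.name)
         + ' WITH PASSWORD = ' + CONVERT(varchar(600), l.password_hash, 1) + ' HASHED'
         + ', SID = ' + CONVERT(varchar(100), p.sid, 1)
         + ', CHECK_POLICY = OFF;'               -- avoids failures on old or weak passwords
    FROM sys.server_principals AS p
    INNER JOIN sys.sql_logins AS l ON l.principal_id = p.principal_id
    WHERE p.type = 'S'                           -- SQL logins only; Windows logins carry no hash
      AND p.name NOT LIKE '##%';                 -- skip internal certificate-mapped logins

    Generating this output on the production server and pushing it to the DR site — as the article ends up doing — keeps the recovery script from depending on the failed server still being reachable.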

    Read the article

< Previous Page | 142 143 144 145 146 147 148 149 150 151 152 153  | Next Page >