Search Results

Search found 5718 results on 229 pages for 'resource'.


  • Typesafe fire-and-forget asynchronous delegate invocation in C#

    - by LBushkin
    I recently found myself needing a typesafe "fire-and-forget" mechanism for running code asynchronously. Ideally, what I would want to do is something like:

        var myAction = (Action)(() => Console.WriteLine("yada yada"));
        myAction.FireAndForget(); // async invocation

    Unfortunately, the obvious choice of calling BeginInvoke() without a corresponding EndInvoke() does not work: it results in a slow resource leak, since the async state is held by the runtime and never released (it expects an eventual call to EndInvoke()). I also can't run the code on the .NET thread pool because it may take a very long time to complete (the guidance is to run only relatively short-lived code on the thread pool), which rules out ThreadPool.QueueUserWorkItem(). Initially, I only needed this behavior for methods whose signatures match Action, Action<...>, or Func<...>, so I put together a set of extension methods (see the listing below) that let me do this without running into the resource leak. There are overloads for each arity of Action and Func. Unfortunately, I now want to port this code to .NET 4, where the number of generic parameters on Action and Func has increased substantially. Before I write a T4 script to generate these, I was hoping to find a simpler, more elegant way to do this. Any ideas are welcome (see the Task-based sketch after this entry).

        public static class AsyncExt
        {
            public static void FireAndForget(this Action action) {
                action.BeginInvoke(OnActionCompleted, action);
            }
            public static void FireAndForget<T1>(this Action<T1> action, T1 arg1) {
                action.BeginInvoke(arg1, OnActionCompleted<T1>, action);
            }
            public static void FireAndForget<T1, T2>(this Action<T1, T2> action, T1 arg1, T2 arg2) {
                action.BeginInvoke(arg1, arg2, OnActionCompleted<T1, T2>, action);
            }
            public static void FireAndForget<TResult>(this Func<TResult> func) {
                func.BeginInvoke(OnFuncCompleted<TResult>, func);
            }
            public static void FireAndForget<T1, TResult>(this Func<T1, TResult> func, T1 arg1) {
                func.BeginInvoke(arg1, OnFuncCompleted<T1, TResult>, func);
            }
            // more overloads of FireAndForget<...>() for Action<...> and Func<...>

            private static void OnActionCompleted(IAsyncResult result) {
                var action = (Action)result.AsyncState;
                action.EndInvoke(result);
            }
            private static void OnActionCompleted<T1>(IAsyncResult result) {
                var action = (Action<T1>)result.AsyncState;
                action.EndInvoke(result);
            }
            private static void OnActionCompleted<T1, T2>(IAsyncResult result) {
                var action = (Action<T1, T2>)result.AsyncState;
                action.EndInvoke(result);
            }
            private static void OnFuncCompleted<TResult>(IAsyncResult result) {
                var func = (Func<TResult>)result.AsyncState;
                func.EndInvoke(result);
            }
            private static void OnFuncCompleted<T1, TResult>(IAsyncResult result) {
                var func = (Func<T1, TResult>)result.AsyncState;
                func.EndInvoke(result);
            }
            // more overloads of OnActionCompleted<...> and OnFuncCompleted<...>
        }

    Read the article
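
    One .NET 4 option worth considering (a minimal sketch, not the original extension-method approach): Task.Factory.StartNew with TaskCreationOptions.LongRunning hints to the scheduler that the work may run for a long time, so it typically gets a dedicated thread instead of tying up a thread-pool worker, and there is no EndInvoke-style cleanup to forget. A continuation observes any exception so it is not left unobserved. Because the arguments are captured by a closure rather than passed through BeginInvoke, one overload per arity is enough and most of the T4-generated boilerplate disappears.

        using System;
        using System.Threading;
        using System.Threading.Tasks;

        public static class FireAndForgetExt
        {
            // Runs the delegate on a dedicated (LongRunning) task; the continuation
            // observes any exception so an unhandled fault does not go unnoticed.
            public static void FireAndForget(this Action action)
            {
                Task.Factory.StartNew(
                        action,
                        CancellationToken.None,
                        TaskCreationOptions.LongRunning,
                        TaskScheduler.Default)
                    .ContinueWith(
                        t => Console.Error.WriteLine(t.Exception),
                        TaskContinuationOptions.OnlyOnFaulted);
            }

            // Arguments are closed over, so the generic overloads just forward to the one above.
            public static void FireAndForget<T1>(this Action<T1> action, T1 arg1)
            {
                ((Action)(() => action(arg1))).FireAndForget();
            }
        }

    Whether LongRunning is appropriate depends on how many of these calls can be in flight at once, since each one may occupy its own thread.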

  • Putting our OLTP and OLAP services on the same cluster

    - by Dynamo
    We're currently in a bit of a debate about what to do with our scattered SQL environment. We are definitely setting up a cluster for our data warehouses and are now deciding whether our OLTP databases should go on the same one. The cluster will be active/active, with database services running on one node and reporting and analytical services on the other. From a technical standpoint I don't see an issue here: with the services running on different nodes they shouldn't compete too heavily for resources. The only physical resource that may be an issue is the shared disk space. Our environment is also quite small; our biggest OLAP database at the moment is only about 40GB, and our OLTP databases are all under 10GB. I see a potential political issue here, as different groups are involved, but I'm strictly wondering whether any major technical issues could arise from this setup.

    Read the article

  • IIS 7.5 and ASP.NET MVC Routing

    - by m__
    I'm running an ASP.NET MVC 3 application on an IIS 7.5 server (my development server). When I set up my production server, something goes wrong: serving the same application binaries, using the same web.config file, and connecting to the same database, I get different results. Something must be wrong with my IIS configuration, but what? Here's an example. When I visit http://mysite.com/An/AspNetMvc/Routed/Address/1, everything works. When I visit http://mysite.com/An/AspNetMvc/Routed/Address/1.1, it works on my development server but not on my production server, which gives the following error:

        HTTP Error 404.0 - Not Found
        The resource you are looking for has been removed, had its name changed, or is temporarily unavailable.
        Module:       IIS Web Core
        Notification: MapRequestHandler
        Handler:      StaticFile
        Error Code:   0x80070002

    Somehow the URL is served as a static file on my production server, which led me to investigate my IIS handler mappings, but without luck.

    Read the article

  • Cannot access internet or remote network after connecting to Windows VPN

    - by Kiewic
    I set up a VPN by creating an incoming VPN connection (VPN server) on my Windows 8 machine at home (not a Windows Server). I forwarded the PPTP port (1723) on my router to this machine and enabled PPTP passthrough. On a second Windows 8 machine, away from home, I created an outgoing VPN connection (VPN client). I am able to connect to my home VPN, but I don't have access to any home resources or even the internet. [Screenshots: the client's ipconfig output and the settings of my VPN server.]

    UPDATE: My VPN server has been assigned the IP address 192.168.1.144 on my home network, so I tried setting the "IP address assignment" range from 192.168.1.150 to 192.168.1.200. When a VPN client connects it now gets an address in that range, but it doesn't make any difference.

    Read the article

  • fetching rss feed, mktime problem (wordpress plugin)

    - by krike
    Sorry for the odd question; I'm not quite sure how to phrase it. The code works, but there is one problem. I fetch items from feeds and compare them to a specific date which is stored in the database as an option. I check whether the fetched item is more recent or not, and if it is, I create a new pending post. However, there is one item that just keeps being added because something is wrong with its date, and I don't understand what, since all other items (before and after it) are blocked once they have been added as posts (i.e. once they are no longer more recent than the date I store as an option). I then echoed that line and this is the result:

        yes it exist! 27-03-2010-04-03
        Free Exclusive Vector Icon Pack: Web User Interface 29-01-2010-03-01 if(1330732800 < 1335830400) 01 05 2012

    The first line checks whether the option still exists and what data is stored in it; the second is the actual item that's causing all the problems: its initial date is less recent than the stored date, but it still passes as more recent. This is the code I use; any help would be great:

        // Note: PHP's mktime() signature is mktime(hour, minute, second, month, day, year),
        // and in date() format strings 'm' is the month and 'i' is the minutes ('h' is the 12-hour hour).
        if (!get_option('feed_last_date')) :
            echo "nope doesn't exist, but we're creating one";
            $time = time() - 60 * 60 * 24 * 30 * 3;
            add_option("feed_last_date", date('d-m-Y-h-m', $time));
        else :
            echo "yes it exist! " . get_option('feed_last_date');
        endif;

        // Create a new instance of SimplePie and pass in the feed links
        $feed = new SimplePie($feeds);
        $feed->handle_content_type();

        $last_date = explode("-", get_option('feed_last_date'));
        $day    = $last_date[0];
        $month  = $last_date[1];
        $year   = $last_date[2];
        $hour   = $last_date[3];
        $minute = $last_date[4];
        $last_date = mktime(0, 0, 0, $day, $month, $year);

        // Loop through the received items
        foreach ($feed->get_items(0, 500) as $item) :
            $feed_date = explode("-", $item->get_date('d-m-Y-h-m'));
            $day    = $feed_date[0];
            $month  = $feed_date[1];
            $year   = $feed_date[2];
            $hour   = $feed_date[3];
            $minute = $feed_date[4];
            $feed_date = mktime(0, 0, 0, $day, $month, $year);

            if ($last_date < $feed_date) :
                echo $item->get_title() . " " . $item->get_date('d-m-Y-h-m')
                    . " if(" . $last_date . " < " . $feed_date . ") <b>" . date("d m Y", $feed_date) . "</b><br />";

                // If the item's date is newer than the one stored in the database, create a new pending post
                $description = "<h3>download the resource</h3><p><a href='" . $item->get_permalink()
                    . "' target='_blank'>" . $item->get_permalink() . "</a></p><h3>Resource description</h3><p>"
                    . $item->get_description() . "</p>";
                $new_post = array(
                    'post_title'    => $item->get_title(),
                    'post_content'  => $description,
                    'post_status'   => 'pending',
                    'post_date'     => date('Y-m-d H:i:s'),
                    'post_author'   => $user_ID,
                    'post_type'     => 'post',
                    'post_category' => array(0)
                );
                $post_id = wp_insert_post($new_post);
            endif;
        endforeach;

        update_option("feed_last_date", date('d-m-Y-h-m', time()));

    Read the article

  • How to split file on Windows 2003 using MS supported tool

    - by Rune
    Hi. Is it possible to split a large file into smaller files on Windows 2003 using a tool provided, supported, or sanctioned by Microsoft? I see that there are a lot of freeware tools (various zip tools) for this task, but I need to move files off of a production server, so I would like to avoid tools I don't know whether I can trust. I would much prefer a tool included in the Windows Server 2003 Resource Kit Tools or something along those lines. Does such a tool exist? Thank you. (A .NET-based workaround is sketched after this entry.)

    Read the article
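
    If a scripted workaround with in-box components is acceptable rather than a packaged tool, one option is the C# compiler (csc.exe) that ships with the .NET Framework, typically under %windir%\Microsoft.NET\Framework\<version>\ on a server that has the Framework installed. The following is a minimal sketch, not a Microsoft-provided utility; the part-naming scheme is illustrative, and it assumes the input path and chunk size in MB are passed on the command line.

        using System;
        using System.IO;

        class FileSplitter
        {
            // Usage: FileSplitter.exe <inputFile> <chunkSizeInMB>
            // Minimal sketch: no argument validation or error handling.
            static void Main(string[] args)
            {
                string inputPath = args[0];
                long chunkSize = long.Parse(args[1]) * 1024 * 1024;

                byte[] buffer = new byte[64 * 1024];
                using (FileStream input = File.OpenRead(inputPath))
                {
                    int part = 0;
                    while (input.Position < input.Length)
                    {
                        // Each part is written next to the original as <name>.partNNN
                        string partPath = String.Format("{0}.part{1:D3}", inputPath, part++);
                        using (FileStream output = File.Create(partPath))
                        {
                            long written = 0;
                            while (written < chunkSize)
                            {
                                int read = input.Read(buffer, 0, (int)Math.Min(buffer.Length, chunkSize - written));
                                if (read == 0) break;          // end of input
                                output.Write(buffer, 0, read);
                                written += read;
                            }
                        }
                    }
                }
            }
        }

    The parts can be recombined on the destination machine with the built-in copy command, e.g. copy /b file.part000 + file.part001 + file.part002 file.restored.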

  • users connecting to remote desktop to use software

    - by Jordan
    I'm an IT assistant at a CNC milling company and we use a program called Made2Manage. It's ERP (enterprise resource planning) software. Each license is something like 5k, so instead of giving each employee their own copy of the software, everyone who uses the program connects to a server that has a copy of M2M on it. It's slow when a bunch of people are connected to it, but I guess they don't want to buy more licenses. Is there a better way to do something like this? How bad a practice is this?

    Read the article

  • incorrect DNS entries on server 2008 r2

    - by user137841
    On the main DC (Windows Server 2008 R2 Standard) in our network I have to clear some old DNS entries every now and then in the forward and reverse lookup zones. I have set the server's Aging/Scavenging settings to scavenge stale resource records, with both the no-refresh interval and the refresh interval set to 5 days. Yet every now and then I still have to log on to the server and remove the DNS entries for computers that are no longer part of the domain or have been renamed, etc. Is there a different way to automatically remove the old entries?

    Read the article

  • Error when I am trying to run Activiti BPM explorer

    - by test test
    I am facing the following problem: I have downloaded Activiti BPM, which runs under Apache Tomcat. I have installed both Java JRE 7 and JDK 1.7.0_06, and I set JAVA_HOME to C:\Program Files\Java\jdk1.7.0_06. When I start Activiti BPM by typing the following at a Windows 7 command line: C:\activiti-5.10\activiti-5.10\setup>ant demo.start, the Tomcat server starts and the demo builds successfully, but if I try to navigate to http://localhost:8080/activiti-explorer I get the following error:

        HTTP Status 404 - /activiti-explorer
        type: Status report
        message: /activiti-explorer
        description: The requested resource (/activiti-explorer) is not available.
        Apache Tomcat/6.0.32

    Read the article

  • MS project publishing to TFS web portal display

    - by denis bastarache
    When we initially created our MPP schedule, I made use of indents/subordinate tasks to break the project down by the various stages of the lifecycle, which is fine; no issues there. But now that I'm trying to publish this to the TFS display, it only picks up the actual action items/sub-tasks, since those are the ones with resource allocation specified. So, for example, I have an "Analysis" phase with a few items underneath it and a "System Requirements" phase with the same items, and when I publish these to TFS it doesn't display the parent distinction between items, so both task instances end up published in TFS under the exact same name. If I can't do this automatically, I'll likely have to edit each task to read "Analysis - Item 1", "Analysis - Item 2", "SRD - Item 1", "SRD - Item 2"... Is there a way to do this automatically, or will I have to go the manual route?

    Read the article

  • iTunes memory usage

    - by Jordan S. Jones
    Why does iTunes use upwards of 70 MB of RAM when it is minimized to my system tray playing music? -- Update -- I understand that iTunes is a resource hog :) What I'm trying to find out is what part of iTunes is using all that RAM. Is it the music library? If I had a smaller music library, would it use less RAM? Is it loading all the album artwork into RAM for some dumb reason? Additionally, are there any recommendations on what someone could do to reduce the amount of RAM it is using?

    Read the article

  • BES 5.0 and MAPI calls to exchange system

    - by nysingh
    We have been using BES 4.1(5) for a while now and it has been a resource hog on Exchange due to the high number of MAPI calls. I have heard that BES 5.0 is even worse: the comparison I heard is that BES 4.1 makes MAPI calls equivalent to 5 Outlook clients per BlackBerry user, while BES 5.0 makes MAPI calls equivalent to 10 Outlook clients per BlackBerry user. Can someone confirm whether this is true? Is BES 5.0 really that bad in terms of MAPI calls and Exchange performance? Thanks.

    Read the article

  • GNU Screen Draw Lag

    - by Daeden
    I like using Screen with multiple splits. I usually like three sections: resource monitoring using htop, a text editor using Vim, and a command line using Bash. My issue is that when I am doing something that writes a good deal of text to stdout, like running make, and I am focused on that section, Screen lags on me; so much so that the other sections no longer update and Screen stops responding to commands like Ctrl-A Tab. I'm not entirely sure what the problem is, but it appears to have something to do with the cursor location, which blinks wildly while this is happening. I'm aware that using Screen's vertical split functionality can lead to lag, but is this the cause? If so, is there a way to fix it aside from redirecting stdout?

    Read the article

  • .VBS scripts have stopped running... No idea why!

    - by Django Reinhardt
    We have two .vbs scripts run by our Task Scheduler that have suddenly stopped working for no reason we can fathom. We haven't significantly altered our system configuration in the last 24 hours, and the scripts have run without a hitch for months. According to the Task Scheduler the scripts just keep running and never stop, which is never normally the case. I stopped all running instances through the Scheduler and manually attempted to run one of the .vbs scripts. I got the following error message:

        Line:   15
        Error:  The system cannot locate the resource specified.
        Code:   800C0005
        Source: msxml3.dll

    Line 15 (or 16 to be more accurate; line 15 itself is blank, but so is line 1) is:

        xml.Send

    What could have suddenly caused this? Looking in system32\ and SysWOW64\ shows that msxml3.dll exists. Anybody got any ideas? Thanks a lot! (A small connectivity check is sketched after this entry.)

    Read the article
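
    For what it's worth, 0x800C0005 (INET_E_RESOURCE_NOT_FOUND, "The system cannot locate the resource specified") raised from msxml3.dll at the Send call usually means the URL opened earlier in the script could not be reached at all, for example after a DNS, proxy, or firewall change on the machine rather than a change to the script itself. That is an assumption about the cause, not a diagnosis. If it helps, a minimal C# probe like the one below can confirm from the same machine whether the endpoint is reachable independently of MSXML; the URL here is only a placeholder for whatever the script actually calls.

        using System;
        using System.Net;

        class ProbeEndpoint
        {
            static void Main(string[] args)
            {
                // Pass the same URL the .vbs script opens; the default below is only a placeholder.
                string url = args.Length > 0 ? args[0] : "http://example.com/service";
                try
                {
                    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
                    request.Method = "GET";
                    request.Timeout = 15000; // 15 seconds
                    using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
                    {
                        Console.WriteLine("Reachable: HTTP {0} {1}", (int)response.StatusCode, response.StatusDescription);
                    }
                }
                catch (WebException ex)
                {
                    // DNS failures, proxy problems, and connection refusals all surface here.
                    Console.WriteLine("Failed: {0} ({1})", ex.Status, ex.Message);
                }
            }
        }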

  • How to setup Wordpress High Availability

    - by Ketam
    I have installed a Galera cluster on 3 cluster nodes plus 1 management node. I wanted to set it up like this:

        Server 1: home page (www.domain.com)
        Server 2: bbPress forum (the Forum tab will forward to forum.domain.com)
        Server 3: BuddyPress activity (the Social tab will forward to social.domain.com)

    The purpose of doing this is to distribute my resources and load-balance across them at the same time. However, I'm having difficulty setting up Apache load balancing / mod_proxy / clustering, or anything else suitable, to get a highly available WordPress. What is the best way to set up a highly available WordPress, and how? Another question: I tried copying the whole set of WordPress files and folders to Server 2, connecting to its local database (which holds the same data, since it is part of the Galera cluster), but the page comes up blank. Any advice? OS: CentOS 6.2. Thanks in advance.

    Read the article

  • TORQUE: Find out why the job is queued?

    - by ahmad
    Under the TORQUE/Maui job scheduling system, there are several reasons for a job to stay in the Q (queued) state. Those that I know of are:

        - There are not enough resources to run the queued job.
        - The user is not allowed to have further running jobs.
        - The user is not allowed to use further computation cores.

    Are there any other reasons for a job to stay queued? Further information: I am asking because I have a couple of queued jobs while some nodes are free, pbs_mom is up on them, and the Maui limits are generously large. Thanks in advance.

    Read the article

  • Intuitive view of what's using the hard drive so much on Windows 7?

    - by Aren Cambre
    Sometimes my hard drive usage is near 100%, and I have no idea what is causing it. Are there any utilities that can help diagnose excessive hard drive usage and have an interface as intuitive as Task Manager's Processes tab, which I can sort by CPU usage? I am aware of procmon, of adding columns like I/O Read Bytes and I/O Write Bytes to Task Manager's Processes tab, and of Resource Monitor's Disk tab. Too often, these don't give me useful information or clearly identify a single process that is hogging the disk.

    Read the article

  • disable browser localization

    - by broiyan
    How do I get the websites that I visit to stop localizing their language, presumably based on my IP location? This is a website-specific issue: for example, economist.com and superuser.com don't do it, but Google Checkout and craigslist.org do. Is there a way to set up Ubuntu and Firefox so that English will always be used for all web pages displayed? Edit: Of course many web pages have a link to an English version, but sometimes they don't. I believe such links usually appear on the root resource, but I sometimes see non-English languages on child resources where such links do not appear. Example: most Blogger.com blogs appear in English, but when I go to a blogger's profile ("view my complete profile"), it appears in another language matching my geographic location.

    Read the article

  • win2003 server problem

    - by Tavo
    I used to have a server running Windows 2000. Yesterday I upgraded it to Windows Server 2003 R2 SP2. My problem now is that at a certain point nobody can access the server: all of the server's shared resources become unavailable. The only solution is to restart the server, even though the server itself appears to be running fine. I don't think it is a LAN connection problem, because this didn't happen with Windows 2000; the PC is connected to the same switch and has exactly the same hardware.

    Read the article

  • Delete temporary files from batch script in xp

    - by Keith Bentrup
    I'm looking for a good batch script that would quickly find and clean all the known-safe temporary folders/files from Windows machines (as many variants as possible), e.g. the Windows temp folder, all users' IE temp folders, etc. I'm fond of UI tools like CCleaner (over cleanmgr.exe), but when I'm trying to clean several computers quickly and/or with minimal involvement, it would be nice to have a script. Plus, with a script I could chain several scripts together, maybe one that then fires up various antivirus and/or malware detectors. Does anyone have a good one, or can you point me to a good resource? (A rough sketch of the kind of cleanup involved follows this entry.)

    Read the article
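
    Not the batch script asked for, but as a sketch of the kind of cleanup such a script would do, here is a minimal C# console program that sweeps the usual temporary locations and silently skips files that are locked or in use. The folder list is illustrative and assumes the XP-style "Documents and Settings" profile layout; a real script would also need the IE cache locations used by other Windows versions.

        using System;
        using System.IO;

        class TempCleaner
        {
            static void Main()
            {
                // System-wide and current-user temp folders.
                string windir = Environment.GetEnvironmentVariable("windir");
                CleanDirectory(Path.Combine(windir, "Temp"));
                CleanDirectory(Path.GetTempPath());

                // Per-user temp and IE cache folders (XP-style profile layout assumed).
                string profileRoot = @"C:\Documents and Settings";
                if (Directory.Exists(profileRoot))
                {
                    foreach (string profile in Directory.GetDirectories(profileRoot))
                    {
                        CleanDirectory(Path.Combine(profile, @"Local Settings\Temp"));
                        CleanDirectory(Path.Combine(profile, @"Local Settings\Temporary Internet Files"));
                    }
                }
            }

            static void CleanDirectory(string path)
            {
                if (!Directory.Exists(path)) return;
                string[] files;
                try { files = Directory.GetFiles(path, "*", SearchOption.AllDirectories); }
                catch (UnauthorizedAccessException) { return; }   // tree not readable at all
                foreach (string file in files)
                {
                    try { File.Delete(file); }                    // skip anything locked or in use
                    catch (IOException) { }
                    catch (UnauthorizedAccessException) { }
                }
            }
        }

    The same structure translates fairly directly into a batch or PowerShell script; the main points are which locations are safe to sweep and that in-use files have to be skipped rather than treated as errors.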

  • Will USMT 4.0 in MDT 2010 Move/Migrate the .NK2 File for Outlook?

    - by Mitch
    We're about to begin a refresh project for about 100 XP Pro laptops and have a concern regarding the .NK2 file, which holds Outlook's cached email addresses. If possible we'd like to have USMT move/migrate this, but I can't find anything that confirms this happens automatically or has been done before. I see lots of manual processes, but at this point I'm not sure we can use those. Has anyone done this or seen it done? Perhaps you can point me to a resource that gives an idea of how it's done? Any information would be appreciated. USMT seems to cover a lot of the details, so missing this part seems odd. Thanks in advance for any responses.

    Read the article

  • Firefox generating error?

    - by Lynda
    I run Firebug on my computer since I develop websites, and I have been noticing this error consistently on every page I go to. I am lost as to what it is and believe Firefox itself might be causing it. Has anyone seen it before? Here is the error:

        An exception occurred.
        Traceback (most recent call last):
          File "resource://jid1-g0j5yenav9jwla-at-jetpack-api-utils-lib/tabs/tab.js", line 254, in null
            .getInterface(Ci.nsIWebNavigation)
        Error: Permission denied for <http://superuser.com> to create wrapper for object of class UnnamedClass

    Read the article

  • windows 7 is not working with Ubuntu 12.04

    - by Anand Soni
    I have a problem: I am unable to start Windows 7, which is installed alongside Ubuntu 12.04 (I want to be able to use the resources of both operating systems). Let me describe the installation I have done. I installed everything from USB because my CD-ROM drive is damaged. First I installed Windows 7 on C:, then I installed Ubuntu 12.04 on the same drive. Now when I start my PC it shows a list of operating systems: Ubuntu 12.04 and Windows 7. When I choose Windows 7, it does not start and the list of operating systems is shown again. I have also tried to reinstall Windows 7 from USB, but it gives an error about something like the UI or the boot configuration, and now Ubuntu 12.04 won't reinstall either. What I want to do now is install Windows 7 and then install Ubuntu 12.04 in VirtualBox so that I can run both systems, since the current setup is not working, but I can't, because I can't start Windows 7 or even format my drive. Please suggest something.

    Read the article

  • Why is ext3 so slow to delete large files?

    - by Janis Peisenieks
    I have a server which makes an incremental backup of a system every night; on Saturdays there is a full backup. After the full backup has finished, a script kicks in that deletes the incrementals. The script sometimes breaks, because the incrementals are each about 10GB and deleting them sometimes takes too long for the script. Could someone explain to me, or point me in the direction of a resource that explains, why ext3 is so slow to delete large files compared to, let's say, NTFS? I know these are two completely different file systems, but I'm really interested in why there is such a big difference in deletion speed.

    Read the article

  • vps running out of memory, 200MB free

    - by demon
    At the beginning of this year I took a VPS for my website because I was running up against the resource limits of shared hosting. Here is what I know about the setup: 2GB of memory with 1GB of swap, Debian x64 (server edition) installed. Software running on the web server: MySQL, Apache, Postfix, POP3, IMAP, amavisd, clamd, cron, fail2ban, munin-node, pure-ftpd, spamd, and nginx. Now for the setup: nginx listens on port 80 and handles the static files; the PHP side is handled by Apache running mod_php in combination with APC (no variable caching!). I am using a pretty "busy" Drupal and phpBB stack on the server; for Drupal I am using Boost and Authcache on a Pressflow stack to handle the server load. phpBB is just phpBB3 with some mods installed, with at most 30 users online at a time. The problem is that it starts using swap a few days after a reboot, and the site becomes slower as a result. I've attached pictures of Monit and Munin, so maybe somebody can help me out. [Screenshots: Monit and Munin graphs.]

    Read the article
