Search Results

Search found 14601 results on 585 pages for 'ms live at edu'.


  • Application deployment problem

    - by Indranil Mutsuddy
    Hello everyone, I developed an application using VS 2008 and MS Access 2007 and it works fine. Now I have to make a setup for it (this is my first project). I have gone through many tutorials about deployment and tried the VS 2008 Setup and Deployment project, but after installation it only runs on my machine and not on others. Sometimes it shows the error "The 'Microsoft.ACE.OLEDB.12.0' provider is not registered on the local machine" (that machine had both VS 2008 and MS Access installed). It has been a week; I've tried what I can and am still trying. I can't believe I am stuck here, nothing seems to work. Please help. The link below is my project, if any of you could spare a little time to check it: http://www.4shared.com/file/7G14MULL/2_GameOnStart.html Thanking you all in advance. Regards, Indranil

    Read the article

  • Windows 8 folder to folder sync software

    - by Danny
    I'm looking for direct folder-to-folder synchronization in Windows 8. I was previously using Live Mesh to accomplish this, but now it looks like that is no longer an option. Note that I'm talking about direct folder-to-folder sync between different computers, not syncing to the cloud. I'm aware of products like Google Drive, SkyDrive, Dropbox, etc.; the problem with them is the space limitation. Basically, I was syncing important files between my desktop and all of my laptops. One folder, for example, is My Pictures, which has almost 40 GB of files, which is why the options listed above are not going to work for me. I just need direct syncing, nothing stored in the cloud. I was told by a Microsoft employee that SkyDrive would be replacing Mesh and would provide all the same functionality. So far this looks to be completely false, since the ability to remote desktop is gone along with folder-to-folder sync. Unless I'm just missing something? (A minimal do-it-yourself mirroring sketch follows this item.)

    Read the article
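
    Not from the item above, but as a rough illustration: direct one-way folder mirroring between two machines can be approximated with a short script run on a schedule, assuming the remote folder is reachable as a network share. A minimal Python sketch; the paths are hypothetical:

        import shutil
        from pathlib import Path

        def mirror(src: Path, dst: Path) -> None:
            """Copy every file from src that is missing in dst or newer than dst's copy."""
            for path in src.rglob("*"):
                if path.is_dir():
                    continue
                target = dst / path.relative_to(src)
                target.parent.mkdir(parents=True, exist_ok=True)
                if not target.exists() or path.stat().st_mtime > target.stat().st_mtime:
                    shutil.copy2(path, target)  # copy2 keeps timestamps

        # Hypothetical paths: local Pictures folder mirrored to a share on a laptop.
        mirror(Path(r"C:\Users\danny\Pictures"), Path(r"\\LAPTOP\Pictures"))

    Run in both directions this becomes a crude two-way sync; a real tool also has to handle deletions and conflicts, which this sketch does not.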

  • What's the fastest way to check the availability of a SQL Server server?

    - by mwolfe02
    I have an MS Access program in use in multiple locations. It connects to MS SQL Server tables, but the server name is different in each location. I am looking for the fastest way to test for the existence of a server. The code I am currently using looks like this:

        ShellWait "sc \\" & ServerName & " qdescription MSSQLSERVER > " & Qt(fn)
        FNum = FreeFile()
        Open fn For Input As #FNum
        Line Input #FNum, Result
        Close #FNum
        Kill fn
        If InStr(Result, "SUCCESS") Then ...

    Here ShellWait executes a shell command and waits for it to finish, Qt wraps a string in double quotes, and fn is a temporary filename variable. I run the above code against a list of server names (of which only one is normally available). The code takes about one second if the server is available and about 8 seconds for each server that is unavailable. I'd like to get both of these lower if possible, but especially the fail case, as that one happens most often. (A rough network-level alternative is sketched below this item.)

    Read the article
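
    As a sketch of the network-level alternative referred to above (not the asker's VBA; the server names are made up): a plain TCP connection attempt to the SQL Server port with a short timeout fails after roughly the timeout instead of ~8 seconds. In Python, assuming default instances listening on port 1433:

        import socket

        def sql_server_reachable(host: str, port: int = 1433, timeout: float = 1.0) -> bool:
            """True if a TCP connection to the SQL Server port succeeds within the timeout."""
            try:
                with socket.create_connection((host, port), timeout=timeout):
                    return True
            except OSError:
                return False

        # Hypothetical server list; only one is normally reachable.
        servers = ["SRV-LONDON", "SRV-PARIS", "SRV-BERLIN"]
        print([s for s in servers if sql_server_reachable(s)])

    Named instances may listen on another port, and an open port does not guarantee logins will succeed, so this is only a fast pre-check.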

  • My server is slower than the average user's computer, should I still offload Access queries to SQL Server? [closed]

    - by andrewb
    Possible Duplicate: How do you do Load Testing and Capacity Planning for Databases

    I have a database set up with MS Access 2007 front ends and a SQL Server 2005 back end. At the moment, all the queries are saved in the front end, as I've only recently moved to a SQL Server back end. I'm wondering how many of those queries I should save as stored procedures/views on SQL Server.

    About the system: the number of concurrent users is only a handful, though it could be as high as 25 at one time (very unlikely). The average computer has an Intel i3-2120 CPU running at 3.3 GHz, which gets a PassMark score of 3,987, whilst the server has an Intel Xeon E5335 running at 2.0 GHz, which gets a PassMark score of 2,637. It's always an awkward situation when an i3 outperforms a Xeon... though the i3 is from Q1 2011 and the Xeon is from Q2 2009. There is potential for a server upgrade in the future, though it wouldn't come easily. I'm inclined to move the queries to the back end, as they are beginning to take noticeable time and I figure that is a better way of doing things. I like the idea of throwing everything at the server, then pushing for a server upgrade. It makes more sense in my mind to be upgrading one server rather than 30 PCs. Or am I being overzealous?

    Why my question isn't a duplicate: it seems that my question has been misinterpreted and labelled a duplicate of quite a different question, one about testing and capacity planning. I'll try to explain how my question is very different from the linked question. The crux of my question is something like "Even though my server is technically slower, is it better to have it doing more of the queries?" There are two ways people could answer this: (1) "I agree the server is going to be slower, but the extra benefits of such and such (like the less Access the better) mean you should move most queries to the server anyway" (or "no, that doesn't outweigh the cost, keep them in Access"); or (2) "Actually the server will be faster because of such and such." I'm hoping people could provide answers like this, and the question in the dupe link doesn't really provide either. OK, sure, I suppose I could do extensive performance testing to compare Access queries running on a local machine to SQL Server queries running on the server, but that sounds like a very hard task (particularly performance testing of Access) compared to someone giving some quick general guidance, and again, my question is looking for a lot more than immediate performance benefit.

    Read the article

  • User given a login prompt when closing Word documents after viewing them in IE7

    - by Martin Owen
    When using IE7 to view Word documents on our CRM system (an ASP.NET 2.0 application running on Windows Server 2003 and IIS 6 and using Windows authentication) I'm finding that a prompt appears when the user closes the document. The Word document is originally opened by clicking a link in the CRM system. Are there permissions that I can set on the folder containing the Word documents to prevent this prompt? I've already tried only allowing the Read permission for the Users group (I've left Administrators with Full Control). If there's another solution to this without using permissions please let me know.

    UPDATE: I ran Fiddler as suggested by JD and here is the output from the two responses after the request for the document. The first seems to be a DAV response and the second is the authentication request. How do I prevent the DAV response and just return the .doc from the server? (A quick way to reproduce the probe outside Word is sketched below this item.)

        OPTIONS / HTTP/1.1
        Translate: f
        User-Agent: Microsoft Data Access Internet Publishing Provider Protocol Discovery
        Host: <REMOVED>
        Content-Length: 0
        Connection: Keep-Alive
        Pragma: no-cache
        X-NovINet: v1.2

        HTTP/1.1 200 OK
        Date: Thu, 18 Feb 2010 13:37:36 GMT
        Server: Microsoft-IIS/6.0
        X-Powered-By: ASP.NET
        MS-Author-Via: DAV
        Content-Length: 0
        Accept-Ranges: none
        DASL: <DAV:sql>
        DAV: 1, 2
        Public: OPTIONS, TRACE, GET, HEAD, DELETE, PUT, POST, COPY, MOVE, MKCOL, PROPFIND, PROPPATCH, LOCK, UNLOCK, SEARCH
        Allow: OPTIONS, TRACE, GET, HEAD, COPY, PROPFIND, SEARCH, LOCK, UNLOCK
        Cache-Control: private

        OPTIONS /docs/ZONE%20100-105.doc HTTP/1.1
        Translate: f
        User-Agent: Microsoft Data Access Internet Publishing Provider Protocol Discovery
        Host: <REMOVED>
        Content-Length: 0
        Connection: Keep-Alive
        Pragma: no-cache
        X-NovINet: v1.2

        HTTP/1.1 401 Unauthorized
        Content-Length: 83
        Content-Type: text/html
        Server: Microsoft-IIS/6.0
        WWW-Authenticate: Basic realm="<REMOVED>"
        X-Powered-By: ASP.NET
        Date: Thu, 18 Feb 2010 13:37:36 GMT

    UPDATE 2: I found a potential workaround for the problem via this post: http://forums.iis.net/p/1149091/1868317.aspx. I moved all of the documents that are being requested into a folder outside of the web root, and created a virtual directory for it (also outside of the web root). When I followed a link to one of the documents in IE and then closed the document, I wasn't presented with a login prompt. I should point out that I'm not using FPSE, unlike the person in the forum post. Ideally I don't want to have to put the documents in a separate virtual directory, but this is the simplest solution I've found so far.

    Read the article
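
    To reproduce the capture above without Word or IE, the same OPTIONS probe can be sent by hand; if the response still carries MS-Author-Via: DAV, the WebDAV discovery (and the resulting prompt) is coming from that URL. A small sketch using Python's standard library; the host name stands in for the value redacted above:

        import http.client

        conn = http.client.HTTPConnection("crm.example.local", 80, timeout=5)
        conn.request("OPTIONS", "/docs/ZONE%20100-105.doc",
                     headers={"Translate": "f", "User-Agent": "manual probe"})
        resp = conn.getresponse()
        print(resp.status, resp.reason)
        print("MS-Author-Via:", resp.getheader("MS-Author-Via"))
        print("DAV:", resp.getheader("DAV"))
        print("Allow:", resp.getheader("Allow"))
        conn.close()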

  • Why does Perl's Net::Msmgr hang when I try to authenticate?

    - by codeholic
    There's a Net::Msmgr module on CPAN. It's written cleanly and the code looks trustworthy at first glance. However, this module seems to be beta and there is little documentation and there are no tests :-/ Has anyone used this module in production? I haven't managed to make it run so far, because it requires all event loop processing to be done in the application and, as I've already said, there is little documentation and there are no working examples to study. This is how far I've got:

        #!/usr/bin/perl
        use strict;
        use warnings;

        use Event;
        use Net::Msmgr::Object;
        use Net::Msmgr::Session;
        use Net::Msmgr::User;

        use constant DEBUG => 511;
        use constant EVENT_TIMEOUT => 5; # seconds

        my ($username, $password) = qw/[email protected] my.password/;
        my $buddy = '[email protected]';

        my $user = Net::Msmgr::User->new(user => $username, password => $password);

        my $session = Net::Msmgr::Session->new;
        $session->debug(DEBUG);
        $session->login_handler(\&login_handler);
        $session->user($user);

        my $conv;

        sub login_handler {
            my $self = shift;
            print "LOGIN\n";
            $self->ui_state_nln;
            $conv = $session->ui_new_conversation;
            $conv->invite($buddy);
        }

        our %watcher;

        sub ConnectHandler {
            my ($connection) = @_;
            warn "CONNECT\n";
            my $socket = $connection->socket;
            $watcher{$connection} = Event->io(
                fd     => $socket,
                cb     => [ $connection, '_recv_message' ],
                poll   => 're',
                desc   => 'recv_watcher',
                repeat => 1,
            );
        }

        sub DisconnectHandler {
            my $connection = shift;
            print "DISCONNECT\n";
            $watcher{$connection}->cancel;
        }

        $session->connect_handler(\&ConnectHandler);
        $session->disconnect_handler(\&DisconnectHandler);

        $session->Login;
        Event::loop();

    This is what it outputs:

        Dispatch Server connecting to: messenger.hotmail.com:1863
        Dispatch Server connected
        CONNECT
        Dispatch Server >>>VER 1 MSNP2 CVR0
        --> VER 1 MSNP2 CVR0
        Dispatch Server >>>USR 2 MD5 I [email protected]
        --> USR 2 MD5 I [email protected]
        Dispatch Server <<<VER 1 CVR0
        <-- VER 1 CVR0

    And that's all; here it hangs. The login handler is not being triggered. What am I doing wrong?

    Read the article

  • Setting up Apache with multiple virtual host when using Plone 4.1

    - by Shaun Owens
    I have a Plone server running on CentOS with multiple instances of Plone (4.0 and 4.1) and multiple sites. I am new to Linux and am having problems getting Apache to work with multiple virtual hosts. The first host listed works just fine but the second host does not. I get the following error message when I start httpd:

        Starting httpd: [Mon Nov 07 14:38:31 2011] [warn] VirtualHost ordevel3.ucdavis.edu:80 overlaps with VirtualHost ordevel4.ucdavis.edu:80, the first has precedence, perhaps you need a NameVirtualHost directive.

    What am I missing to get the virtual hosts to work correctly? Below is my syntax from httpd.conf.

        <VirtualHost ordevel3.abc.edu:80>
            ServerAlias ordevel3.abc.edu
            ServerAdmin ortech@abc.edu
            ServerSignature On
            <IfModule mod_rewrite.c>
                RewriteEngine On
                # serving icons from apache 2 server
                RewriteRule ^/icons/ - [L]
                RewriteRule ^/(.*) \
                    http://localhost:8080/VirtualHostBase/http/%{SERVER_NAME}:80/itsdevel3/VirtualHostRoot/$1 [L,P]
            </IfModule>
            <IfModule mod_proxy.c>
                ProxyVia On
                # prevent the webserver from beeing used as proxy
                <LocationMatch "^[^/]">
                    Deny from all
                </LocationMatch>
            </IfModule>
        </VirtualHost>

        <VirtualHost ordevel4.abc.edu:80>
            ServerAlias ordevel4.abc.edu
            ServerAdmin ortech@abc.edu
            ServerSignature On
            <IfModule mod_rewrite.c>
                RewriteEngine On
                # serving icons from apache 2 server
                RewriteRule ^/icons/ - [L]
                RewriteRule ^/(.*) \
                    http://localhost:8180/VirtualHostBase/http/%{SERVER_NAME}:80/ITS/VirtualHostRoot/$1 [L,P]
            </IfModule>
            <IfModule mod_proxy.c>
                ProxyVia On
                # prevent the webserver from beeing used as proxy
                <LocationMatch "^[^/]">
                    Deny from all
                </LocationMatch>
            </IfModule>
        </VirtualHost>

    Read the article

  • Excel data representation: show me all people who did not pass the exam

    - by dreftymac
    Background: I have an Excel spreadsheet with the results of a pass/no-pass exam. Students are allowed to take the exam as often as they want until they either pass or give up trying.

        student          ;; result  ;; date
        sally@fiz.edu    ;; no-pass ;; 2000-06-07
        bravo@fiz.edu    ;; pass    ;; 2000-06-07
        charlie@fiz.edu  ;; pass    ;; 2000-06-07
        delta@fiz.edu    ;; no-pass ;; 2000-06-07
        alpha@fiz.edu    ;; pass    ;; 2000-06-07
        sally@fiz.edu    ;; pass    ;; 2000-06-08
        delta@fiz.edu    ;; no-pass ;; 2000-06-08

    Question: Using a pivot table or something else, how can I get Excel to show me a clean report or representation of this data on another sheet that answers the question: who are all the people who took the exam but never got a passing grade? In the above example it would just show me delta@fiz.edu ;; no-pass ;; with all the dates that delta took the exam. I know Excel is not a database nor a reporting tool per se, but it would be great if I could get it to do this. (A scripted equivalent of the filter is sketched below this item.)

    Read the article
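
    Outside Excel, the "never passed" filter is a two-step set operation: collect everyone with at least one pass, then keep the attempts of everyone else. A Python/pandas sketch, assuming the sheet is exported to a hypothetical results.csv with columns student, result and date:

        import pandas as pd

        df = pd.read_csv("results.csv")  # columns: student, result, date

        # Students with at least one passing attempt.
        passed = set(df.loc[df["result"] == "pass", "student"])

        # Every attempt by students who never passed (delta@fiz.edu in the example above).
        never_passed = df[~df["student"].isin(passed)].sort_values(["student", "date"])
        print(never_passed)

    Within Excel itself the usual equivalent is a helper column (for example a COUNTIFS of passes per student) filtered to zero.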

  • MSBuild script fails but produces no errors

    - by Kate
    I have an MSBuild script that I am executing through TeamCity. One of the tasks it runs is from Xheo DeployLX CodeVeil, which obfuscates some DLLs. The task I am using is called VeilProject. I have run the CodeVeil project through the interface manually and it works correctly, so I think I can safely assume that the actual obfuscation process is OK. This task used to take around 40 minutes, and the rest of the MSBuild file executed perfectly and finished without errors. For some reason this task is now taking 1 hr 20 minutes or so to execute. Once the VeilProject task is finished, its output says it completed successfully; however, the MSBuild script fails at this point. I have a task directly after the VeilProject task and its output never appears. Using diagnostic output from MSBuild I can see the following. My questions are: could the MSBuild script have timed out, so that once the task completes after a certain timeout period the build just fails? And why would the build fail with no errors and no warnings?

        [05:39:06]: [Target "Obfuscate"] Finished.
        [05:39:06]: [Target "Obfuscate"] Saving exception map
        [05:49:21]: [Target "Obfuscate"] Ended at 11/05/2010 05:49:21, ~1 hour, 48 minutes, 6 seconds
        [05:49:22]: [Target "Obfuscate"] Done.
        [05:49:51]: MSBuild output:
        Ended at 11/05/2010 05:49:21, ~1 hour, 48 minutes, 6 seconds (TaskId:8)
        Done. (TaskId:8)
        Done executing task "VeilProject" -- FAILED. (TaskId:8)
        Done building target "Obfuscate" in project "AMK_Release.proj.teamcity.patch.tcprojx" -- FAILED.: (TargetId:12)
        Done Building Project "C:\Builds\Scripts\AMK_Release.proj.teamcity.patch.tcprojx" (All target(s)) -- FAILED.

        Project Performance Summary:
            6535484 ms  C:\Builds\Scripts\AMK_Release.proj.teamcity.patch.tcprojx  1 calls
            6535484 ms  All                                                        1 calls

        Target Performance Summary:
                156 ms  PreClean               1 calls
                266 ms  SetBuildVersionNumber  1 calls
               2406 ms  CopyFiles              1 calls
            6532391 ms  Obfuscate              1 calls

        Task Performance Summary:
                 16 ms  MakeDir                 2 calls
                 31 ms  TeamCitySetBuildNumber  1 calls
                 31 ms  Message                 1 calls
                 62 ms  RemoveDir               2 calls
                234 ms  GetAssemblyIdentity     1 calls
               2406 ms  Copy                    1 calls
            6528047 ms  VeilProject             1 calls

        Build FAILED.
            0 Warning(s)
            0 Error(s)
        Time Elapsed 01:48:57.46

        [05:49:52]: Process exit code: 1
        [05:49:55]: Build finished

    Read the article

  • Using MS Standalone profiler in VS2008 Professional

    - by fishdump
    I am trying to profile my .NET DLL while running it from the VS unit testing tools, but I am having problems. I am using the standalone command-line profiler, as VS2008 Professional does not come with an inbuilt profiler. I have an open CMD window and have run the following commands (I instrumented it earlier, which is why vsinstr gave the warning that it did):

        C:\...\BusinessRules\obj\Debug>vsperfclrenv /samplegclife /tracegclife /globalsamplegclife /globaltracegclife
        Enabling VSPerf Sampling Attach Profiling. Allows to 'attaching' to managed applications.
        Current Profiling Environment variables are:
        COR_ENABLE_PROFILING=1
        COR_PROFILER={0a56a683-003a-41a1-a0ac-0f94c4913c48}
        COR_LINE_PROFILING=1
        COR_GC_PROFILING=2

        C:\...\BusinessRules\obj\Debug>vsinstr BusinessRules.dll
        Microsoft (R) VSInstr Post-Link Instrumentation 9.0.30729 x86
        Copyright (C) Microsoft Corp. All rights reserved.
        Error VSP1018 : VSInstr does not support processing binaries that are already instrumented.

        C:\...\BusinessRules\obj\Debug>vsperfcmd /start:trace /output:foo.vsp
        Microsoft (R) VSPerf Command Version 9.0.30729 x86
        Copyright (C) Microsoft Corp. All rights reserved.

    I then ran the unit tests that exercised the instrumented code. When the unit tests were complete, I did:

        C:\...\BusinessRules\obj\Debug>vsperfcmd /shutdown
        Microsoft (R) VSPerf Command Version 9.0.30729 x86
        Copyright (C) Microsoft Corp. All rights reserved.
        Waiting for process 4836 ( C:\Program Files\Microsoft Visual Studio 9.0\Common7\IDE\vstesthost.exe) to shutdown...

    It was clearly waiting for VS2008 to close, so I closed it:

        Shutting down the Profile Monitor

    All looking good; there was a 3.2 MB foo.vsp file in the directory. I next did:

        C:\...\BusinessRules\obj\Debug>vsperfreport foo.vsp /summary:all
        Microsoft (R) VSPerf Report Generator, Version 9.0.0.0
        Copyright (C) Microsoft Corporation. All rights reserved.
        VSP2340: Environment variables were not properly set during profiling run and managed symbols may not resolve. Please use vsperfclrenv before profiling.
        File opened
        Successfully opened the file.
        A report file, foo_Header.csv, has been generated.
        A report file, foo_MarksSummary.csv, has been generated.
        A report file, foo_ProcessSummary.csv, has been generated.
        A report file, foo_ThreadSummary.csv, has been generated.
        Analysis completed
        A report file, foo_FunctionSummary.csv, has been generated.
        A report file, foo_CallerCalleeSummary.csv, has been generated.
        A report file, foo_CallTreeSummary.csv, has been generated.
        A report file, foo_ModuleSummary.csv, has been generated.

    Notice the warning about environment variables and using vsperfclrenv? But I had run it! Maybe I used the wrong switches? I don't know. Anyway, loading the CSV files into Excel or using the PerfConsole tool gives loads of useful info with useless symbol names:

        *** Loading commands from: C:\temp\PerfConsole\bin\commands\timebytype.dll
        *** Adding command: timebytype
        *** Loading commands from: C:\temp\PerfConsole\bin\commands\partition.dll
        *** Adding command: partition
        Welcome to PerfConsole 1.0 (for bugs please email: [email protected]), for help type: ?, for a quickstart type: ??

        > load foo.vsp
        *** Couldn't match to either expected sampled or instrumented profile schema, defaulting to sampled
        *** Couldn't match to either expected sampled or instrumented profile schema, defaulting to sampled
        *** Profile loaded from 'foo.vsp' into @foo

        > functions @foo
        >>>>>
                   Exclusive             Inclusive            Function Name   Module Name
        --------------------  --------------------  -----------------------  ------------
        900,798,600,000.00 %  900,798,600,000.00 %  0x0600003F               20397910
         14,968,500,000.00 %   44,691,540,000.00 %  0x06000040               14736385
          8,101,253,000.00 %   14,836,330,000.00 %  0x06000041               5491345
          3,216,315,000.00 %    6,876,929,000.00 %  0x06000042               3924533
        <snip>
             71,449,430.00 %       71,449,430.00 %  0x0A000074               42572
             52,914,200.00 %       52,914,200.00 %  0x0A000073               0
                 14,791.00 %       13,006,010.00 %  0x0A00007B               0
                199,177.00 %        6,082,932.00 %  0x2B000001               5350072
              2,420,116.00 %        2,420,116.00 %  0x0A00008A               0
                    836.00 %          451,888.00 %  0x0A000045               0
                  9,616.00 %          399,436.00 %  0x0A000039               0
                 18,202.00 %          298,223.00 %  0x06000046               1479900

    I am so close to being able to find the bottlenecks, if only it would give me the function and module names instead of hex numbers! What am I doing wrong? --- Alistair.

    Read the article

  • MS cash drawer with epson TM-IV88 Status API

    - by Xience
    Does anyone know how to monitor a cash drawer's open/close state using the Advanced Printer Driver's Status API for the Epson TM-88IV thermal printer? I wish I could use the OPOS ADK for .NET, but haven't had luck setting it up on Windows 7. Also, does anyone know how to become part of the Epson developer network? I have gone through the information available at www.epson-pos.com but there is no information available on ESC/POS codes. Please help.

    Read the article

  • Codeigniter benchmarking, where are these ms coming from?

    - by ropstah
    I'm in the process of benchmarking my website.

        class Home extends Controller {

            function Home() {
                parent::Controller();
                $this->benchmark->mark('Constructor_start');
                $this->output->enable_profiler(TRUE);
                $this->load->library('MasterPage');
                $this->benchmark->mark('Constructor_end');
            }

            function index() {
                $this->benchmark->mark('Index_start');
                $this->masterpage->setMasterPage('master/home');
                $this->masterpage->addContent('home/index', 'page');
                $this->masterpage->show();
                $this->benchmark->mark('Index_start');
            }
        }

    These are the results:

        Loading Time Base Classes:  0.0076
        Constructor:                0.0007
        Index:                      0.0440
        Controller Execution Time ( Home / Index ):  0.4467
        Total Execution Time:       0.4545

    I understand the following: Loading Time Base Classes (0.0076), Constructor (0.0007), Index (0.0440). But where is the rest of the time coming from?

    Read the article

  • MS SQL 2008, join or no join?

    - by Patrick
    Just a small question regarding joins. I have a table with around 30 fields and I was thinking about making a second table to store 10 of those fields, then joining them back in with the main data. The 10 fields that I was planning to store in a second table do not get queried directly; they are just settings for the data in the first table. Something like:

        Table 1: Id, Data1, Data2, Data3, etc.
        Table 2: Id (same Id as table one), Settings1, Settings2, Settings3

    Is this a bad solution? Should I just use one table? How much performance impact does it have? All entries in Table 1 would then also have an entry in Table 2.

    A small update is in order. Most of the Data fields are of type varchar and two of them are of type text. How is indexing treated? My plan is to index two data fields: email (varchar 50) and author (varchar 20). And yes, all records in Table 1 will have a record in Table 2. Most of the settings fields are of the bit type, around 80%; the rest is a mix of int and varchar. The varchars can be null.

    Read the article

  • Web application creation in IIS7 via MS.Web.Admin

    - by Jon Ownbey
    I am attempting to create separate workflow instances as applications in IIS7 using the Microsoft.Web.Administration DLL. When it attempts to add the Application to the site's ApplicationsCollection I get a COM error: "Invalid application path\r\n".

        using (ServerManager manager = new ServerManager())
        {
            var site = manager.Sites.Where(x => x.Name == Properties.Settings.Default.WorkflowWebsiteName).Single();

            StringBuilder stringBuilder = new StringBuilder()
                .Append(m_workflowDefinition.AccountId)
                .Append("/")
                .Append(m_workflowDefinition.WorkflowDefinitionId)
                .Append("/")
                .Append(m_workflowDefinition.Version)
                .Append("/");

            string virtualPath = stringBuilder.ToString();
            string physicalPath = Properties.Settings.Default.ApplicationPoolString + virtualPath.Replace("/", "\\");

            if (!Directory.Exists(physicalPath))
                Directory.CreateDirectory(physicalPath);

            // Create the workflow service definition file
            using (StreamWriter writer = new StreamWriter(Path.Combine(physicalPath, m_workflowDefinition.WorkflowName + WORKFLOW_FILE_EXTENSION)))
            {
                writer.Write(m_workflowDefinition.Definition);
            }

            // Copy dependencies
            string dependencyPath = m_workflowDefinition.DependenciesPath;
            CopyAll(new DirectoryInfo(dependencyPath), new DirectoryInfo(physicalPath));

            // Create a new IIS application for the workflow
            var apps = site.Applications.Where(x => x.Path == virtualPath);
            if (apps.Count() > 0)
            {
                site.Applications.Remove(apps.Single());
            }

            Application app = site.Applications.Add(virtualPath, physicalPath);
            app.ApplicationPoolName = "Workflow AppPool";
            app.EnabledProtocols = PROTOCOLS;
            manager.CommitChanges();
        }

    The value assigned to virtualPath is like "something/something/something" and physicalPath is like "c:\inetpub\wwwroot\Workflow\something\something\something". Any ideas? Any help is greatly appreciated.

    Read the article

  • Rails - MS-SQL Server problems (unixODBC, FreeTDS) on Mac 10.6

    - by TMB
    Followed the instructions on the Rails wiki and have had success connecting to SQL Server 2000 with tsql -- both with DSN-less and DSN connections. I'm running Mac OS X 10.6.3. Wiki instructions here. Installed ruby-odbc, dbi (0.4.0), dbd-odbc (2.4.5), activerecord-sqlserver-adapter (2.3.5). In my database.yml (Rails 2.3.6):

        development:
          adapter: sqlserver
          mode: ODBC
          dsn: 'DRIVER=/usr/local/lib/libtdsodbc.so;TDS_Version=8.0;SERVER=mssql01.discountasp.net;DATABASE=DB_164368_dmusd;Port=1433;uid=DB_164368_dmusd_user;pwd=Schools77;'

    This yields the following error:

        ODBC::Error: S1090 (0) [unixODBC][Driver Manager]Invalid string or buffer length

    When I attempt to use a DSN connection, I get the following error:

        ODBC::Error: IM002 (0) [unixODBC][Driver Manager]Data source name not found, and no default driver specified

    I have in fact verified that the FreeTDS driver (libtdsodbc.so) is installed and that the path is correct. Can anyone spot the error of my ways? Thanks in advance. (A quick way to test the ODBC layer on its own is sketched below this item.)

    Read the article
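
    One way to narrow down whether the failure is in the unixODBC/FreeTDS layer or in the Ruby stack is to exercise the same DSN-less string from a different ODBC client. A sketch using Python's pyodbc (an assumption: pyodbc built against the same unixODBC), with the password elided:

        import pyodbc

        conn_str = (
            "DRIVER=/usr/local/lib/libtdsodbc.so;"
            "TDS_Version=8.0;"
            "SERVER=mssql01.discountasp.net;"
            "Port=1433;"
            "DATABASE=DB_164368_dmusd;"
            "UID=DB_164368_dmusd_user;"
            "PWD=...;"  # elided
        )
        conn = pyodbc.connect(conn_str)
        print(conn.cursor().execute("SELECT @@VERSION").fetchone()[0])

    If this fails with the same S1090 error, the problem sits below Ruby (driver or ODBC manager); if it works, the ruby-odbc/dbd-odbc side is the place to look.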

  • sp_send_dbmail - MS SQL 2008

    - by Nev_Rahd
    sp_send_dbmail works fine on one server, as below:

        EXEC msdb.dbo.sp_send_dbmail
            @profile_name = 'xxxx Mail Profile',
            @recipients = 'xxx.com',
            @body = 'xxxxxxxx',
            @subject = 'xxx - Please do not reply to this email';

    But on another server, which has the same Mail profile set up, it throws an error saying it expects values for the parameters @copy_recipients and @blind_copy_recipients (even though these are optional). Where do I need to check? Any help please. Thanks

    Read the article

  • Difference between $().click(fn), $().bind(‘click’,fn), $().live('click',fn) and $().delegate('td',

    - by I Like PHP
    Hello all, I know there are a lot of questions similar to this, but I want to see a clear difference between all of these jQuery functions together on this page, with examples, so it is easier to understand the mechanism of each one. I have also read the reference on the main jQuery site, but there is no comparison there. Please do not just refer me to a link that covers part of the question. Please describe exactly how all four functions work, how they differ, and which should be preferred in which situation. Note: if there are any other functions with the same functionality/mechanism, please share them. Thanks a lot.

    Read the article

  • How to Transform a user's search string into a MS SQL Full-Text Search Phrase

    - by Atomiton
    I've searched for answers to this and I can't seem to find an answer to what should be somewhat simple. This is related to another question I asked, but it's different. What's the best way to take a user's search phrase and throw it into a CONTAINSTABLE(table, column, @phrase, topN) search condition? Say, for example, the user inputs: Books by "Dr. Seuss". What's the best way to turn that into something that will return results from my CONTAINSTABLE() call? I was previously parsing the search phrase and writing something like ISABOUT("Books" WEIGHT(1.0), "by" WEIGHT(0.9), "Dr. Seuss" WEIGHT(0.8)) as my @phrase, but ISABOUT seems to return odd results, especially when one-word searches are entered. Any ideas? (A small parsing sketch follows this item.)

    Read the article
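
    Not the asker's code, just an illustration of the parsing step: split the raw input into quoted phrases and bare words, then quote each term so CONTAINSTABLE treats phrases as phrases. A Python sketch (stop words such as "by" may still need filtering, since AND-ing them in can make the condition match nothing on some configurations):

        import re

        def to_contains_phrase(user_input: str) -> str:
            """'Books by "Dr. Seuss"' -> '"Books" AND "by" AND "Dr. Seuss"'"""
            parts = re.findall(r'"([^"]+)"|(\S+)', user_input)
            terms = [quoted or bare for quoted, bare in parts]
            # Escape embedded double quotes and wrap each term so multi-word phrases stay intact.
            return " AND ".join('"' + t.replace('"', '""') + '"' for t in terms)

        # Pass the result as the @phrase parameter of CONTAINSTABLE(table, column, @phrase, topN).
        print(to_contains_phrase('Books by "Dr. Seuss"'))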

  • ms sql use like statement result in if statement

    - by Asha
        declare @d varchar
        set @d = 'No filter'
        if (@d like 'No filter')
        begin
            select 'matched'
        end
        else
        begin
            select 'not matched'
        end

    The result of the above is always 'not matched'. Can anybody tell me why, and how I can use the LIKE or '=' result in my stored procedure? Thanks.

    Read the article

  • vb.net A first chance exception of type 'System.Runtime.InteropServices.COMException' occurred in ms

    - by prasoon99
    I just installed Visual Basic 2010 Express. I created a simple console application:

        Module Module1
            Sub Main()
                Dim i As Integer
                i = 0
            End Sub
        End Module

    I'm getting the following error SIX times before I get to the line "i = 0":

        A first chance exception of type 'System.Runtime.InteropServices.COMException' occurred in mscorlib.dll

    Can anyone help? Why is this happening? Is there something wrong with my configuration? -Prasoon

    Read the article
