Search Results

Search found 15651 results on 627 pages for 'setup'.

Page 328/627

  • Routing a url to fetch content from another site

    - by Abhishek
    Environment: IIS 7. I have a default site www.domain.com whose folder is C:\Inetpub\wwwroot\domain. There is a subdomain www.subdomain.domain.com whose folder is C:\Inetpub\wwwroot\domain\subdomain. Now I have set up a new website on an external server. I cannot put the content on the above server for certain reasons. I need the URL www.subdomain.domain.com/blog to fetch content from this external server while the URL remains the same. How can this be achieved in IIS 7?
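
    The usual IIS 7 answer here is a reverse proxy (Application Request Routing plus a URL Rewrite rule on /blog), so the external content is fetched server-side while the browser URL stays put. Purely as an illustration of the idea, here is a minimal ASP.NET pass-through handler sketch; the class name, the /blog handler mapping it assumes in web.config, and the external server address are hypothetical, and it forwards no headers or cookies:

        using System.Net;
        using System.Web;

        // Relays any request it receives to the external server and writes the
        // response back, so the browser URL stays on www.subdomain.domain.com/blog.
        public class BlogProxyHandler : IHttpHandler
        {
            // Hypothetical address of the external server hosting the blog content.
            private const string ExternalRoot = "http://external-server.example.com";

            public bool IsReusable { get { return true; } }

            public void ProcessRequest(HttpContext context)
            {
                string target = ExternalRoot + context.Request.RawUrl;

                using (var client = new WebClient())
                {
                    byte[] body = client.DownloadData(target);
                    context.Response.ContentType =
                        client.ResponseHeaders[HttpResponseHeader.ContentType];
                    context.Response.BinaryWrite(body);
                }
            }
        }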

    Read the article

  • Windows 7 admin denied access to taskmgr, system32 dir

    - by DotNet Zebra
    I have a Windows 7 (32-bit) box with 2 users, both admins (my wife and I are both developers). My admin account was created during Windows setup; hers was created later. Both accounts are in the same groups, yet we have VERY different permissions. In the beta and RC, both accounts worked identically (RC to RTM was a fresh install on this box, not an upgrade). I have a C:\bin folder with the Sysinternals utilities and a bunch of other stuff. Running anything in there or in system32 just works on my account; on hers I get access denied errors (cannot access file or path). If I right-click and try Run As Administrator, I still get the same thing!

    Read the article

  • Single Exchange 2007 server - two AD domains

    - by TheCleaner
    CURRENT: single domain, single Exchange 2007
    NEW: two domains, single Exchange 2007
    Can this be done? Details: The current setup is a single W2k3 domain with a single Exchange 2007 server. We are merging with another company that currently hosts their email with their ISP via POP3. We'd like to start hosting their email on our Exchange server using their existing domain SMTP addresses. They don't have an AD domain at all at the moment. Recommendations? Can I do this with a trust between the 2 domains? Requirements:
    - They can't have multiple SMTP addresses across both domains, such as I've seen in articles about "hosting multiple domains". I want companyA to keep the same account settings they've always had, and companyB to keep the same SMTP address they've had without gaining an additional one on the current companyA Exchange domain.
    - They should be able to collaborate (calendar, contacts, GALs) but should still be distinguishable based on which company they "work for".
    Please help...thanks!

    Read the article

  • Swapping 3Ware Raid Card - Best Procedure

    - by Brian Lee Jackson
    We have a server running an old firmware version and driver on its 3Ware 9650SE RAID controller (I just took this position). We have been having issues with the server and seem to have narrowed it down to the RAID card. I will be replacing the RAID card with the same model of 3Ware 9650SE; however, the card ordered will most likely have newer firmware on it. I managed to back up all the data to a very large drive. My plan is to update the firmware/driver on the current setup (which is still booting) and verify that everything works, then swap in the new RAID card, check its firmware version (without letting it POST), and update to the newest firmware if needed via the card's Java management utility. Is this the best route? Thanks!

    Read the article

  • Leveraging .Net 4.0 Framework Tools For Encrypting Web Configuration Sections

    - by Sam Abraham
    I would like to share a few points with regards to encrypting web configuration sections in .Net 4.0. This information is also applicable to .Net 3.5 and 2.0. Two methods can work perfectly for encrypting connection strings in a Web project configuration file:

    1- Do It All Yourself! In this approach, helper functions for encrypting/decrypting configuration file content are implemented. The program would explicitly retrieve the appropriate content from the configuration file and then decrypt it. The disadvantages of this implementation are the added overhead of maintaining the encryption/decryption code, as well as the burden of always ensuring sections are decrypted before use and re-encrypted whenever edited.

    2- Leverage the .Net 4.0 Framework (The Way to Go!) Fortunately, all the tools needed for protecting configuration files are built into the .Net 2.0/3.5/4.0 versions with very little setup needed. To encrypt connection strings, one can use the ASP.Net IIS Registration Tool (Aspnet_regiis.exe). Note that a 64-bit version of the tool also exists under the Framework64 folder for 64-bit systems. The command we need to encrypt our web.config file connection strings is simply the following:

    Aspnet_regiis -pe "connectionStrings" -app "/SampleApplication" -prov "RsaProtectedConfigurationProvider"

    To later decrypt this configuration section:

    Aspnet_regiis -pd "connectionStrings" -app "/SampleApplication"

    The following is a brief description of the command-line options used in the example above. Aspnet_regiis supports many more options, which you can read about in the links provided for reference below.

    Option   Description
    -pe      Section name to encrypt
    -pd      Section name to decrypt
    -app     Web application name
    -prov    Encryption/Decryption provider

    ASP.Net automatically decrypts the content of the Web.Config file at runtime, so no programming changes are needed. Another tool, aspnet_setreg.exe, is to be used if certain configuration file sections pertinent to the .Net runtime are to be encrypted. For more information on when and how to use aspnet_setreg, please refer to the references below. Hope this helps!

    Some great references concerning the topic:
    http://msdn.microsoft.com/en-us/library/ff650037.aspx
    http://msdn.microsoft.com/en-us/library/zhhddkxy.aspx
    http://msdn.microsoft.com/en-us/library/dtkwfdky.aspx
    http://msdn.microsoft.com/en-us/library/68ze1hb2.aspx
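
    If running Aspnet_regiis.exe is not convenient (for example on a shared host), the same framework support can also be invoked from code through the configuration API. A minimal sketch, assuming the code runs with write access to the application's web.config; the class name and the "/SampleApplication" virtual path are illustrative:

        using System.Configuration;
        using System.Web.Configuration;

        public static class ConfigProtection
        {
            // Encrypts or decrypts the <connectionStrings> section of the given
            // application's web.config with the same RSA provider Aspnet_regiis uses.
            public static void ProtectConnectionStrings(string appVirtualPath, bool encrypt)
            {
                Configuration config = WebConfigurationManager.OpenWebConfiguration(appVirtualPath);
                ConfigurationSection section = config.GetSection("connectionStrings");

                if (encrypt && !section.SectionInformation.IsProtected)
                {
                    section.SectionInformation.ProtectSection("RsaProtectedConfigurationProvider");
                }
                else if (!encrypt && section.SectionInformation.IsProtected)
                {
                    section.SectionInformation.UnprotectSection();
                }

                config.Save();
            }
        }

    A call such as ConfigProtection.ProtectConnectionStrings("/SampleApplication", true) encrypts the section in place, and ASP.Net still decrypts it transparently at runtime.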

    Read the article

  • Synchronizing local and remote MySQL databases with C#

    - by Neo
    I've posted this here as it's more of a MySQL question than a C# one. I have written some software that starts a local instance of MySQL when it first runs. Once MySQL is up, I would like to synchronize the data between the remote database table and the local database table that the software uses (it shouldn't sync any other databases or tables, as there are a lot). I have replication set up to synchronize the entire database to another server, which works, but if the server goes down the replication never comes back up. Based on that I don't think replication will work here, because when the software is closed it also closes MySQL. So what would be the best method of synchronizing the remote and local databases?
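
    For a single table, one straightforward approach is a timestamp-based pull on startup: copy any remote rows changed since the last sync into the local copy. A minimal one-way sketch using MySQL Connector/NET; the table name my_table, its columns (id primary key, name, updated_at) and the connection strings are all hypothetical:

        using System;
        using MySql.Data.MySqlClient;

        class TableSync
        {
            // Pulls rows changed on the remote server since lastSync into the local
            // copy of one table, upserting by primary key with REPLACE INTO.
            public static void PullChanges(string remoteConnStr, string localConnStr, DateTime lastSync)
            {
                using (var remote = new MySqlConnection(remoteConnStr))
                using (var local = new MySqlConnection(localConnStr))
                {
                    remote.Open();
                    local.Open();

                    var select = new MySqlCommand(
                        "SELECT id, name, updated_at FROM my_table WHERE updated_at > @since", remote);
                    select.Parameters.AddWithValue("@since", lastSync);

                    using (var reader = select.ExecuteReader())
                    {
                        while (reader.Read())
                        {
                            var upsert = new MySqlCommand(
                                "REPLACE INTO my_table (id, name, updated_at) VALUES (@id, @name, @updated)",
                                local);
                            upsert.Parameters.AddWithValue("@id", reader.GetInt32(0));
                            upsert.Parameters.AddWithValue("@name", reader.GetString(1));
                            upsert.Parameters.AddWithValue("@updated", reader.GetDateTime(2));
                            upsert.ExecuteNonQuery();
                        }
                    }
                }
            }
        }

    Pushing local changes back out would mirror the same pattern in the other direction, with some rule such as last-write-wins on updated_at to resolve rows edited on both sides.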

    Read the article

  • Strange performance from RAID5 using WD RE4 disks

    - by Howard
    I've noticed a bit of a performance issue with some WD RE4 drives I'm using under AMD's hardware RAID solution. First a bit of background:
    Environment: Windows 7 Home Premium x64
    HDDs: 3x 1TB WD RAID Edition 4 in a RAID 5 setup with a 128 KB stripe (2TB usable space)
    Testing tool: HD Tune, process set to "High Priority"
    Processor: AMD Phenom II X6 1100T
    RAM: 16GB DDR3/1600MHz
    Motherboard: MSI 970A-G45
    The image below pretty much depicts the issue I'm having. Every test shows the same thing: a period of similar length where performance drops to a few megabytes a second. This can't be a TLER issue, as the purpose of RE4s is to work around that. Any help would be greatly appreciated.

    Read the article

  • How to back up a network volume to my Time Capsule?

    - by Mike
    I have a Time Capsule that I'm using for my backups. I have a network volume (coincidentally on the same time capsule) that I'd like to back up as well. How can I tell Time Machine to back up network volumes in addition to my main laptop hard drive? PS: yes, I know this setup isn't ideal. It'll incur 2x network overhead when backing up the network volume, plus my data won't be safe in the event of a drive failure since both copies will be on the same disk. However, it will give me some small amount of safety in the event I accidentally delete files on the network volume, among other things.

    Read the article

  • Netboot Intel Macs without BSDP

    - by notpeter
    I have a netboot setup with DeployStudio that works great in my lab, but doesn't work on our main network. After some digging, I believe it's because our network admins are filtering BSDP (Boot Service Discovery Protocol) on our subnet at the switch level. Is it possible to hard code which server my clients (early 2007 iMac Core2Duos) should boot from without relying on BSDP? Perhaps relevant details: I do not have control over switch configs or DHCP settings. Client and server are running 10.6 Snow Leopard. The clients see the netboot server advertising itself in the 'Startup Disk' system preferences pane, but when I go to netboot it just leaves me with a flashing globe.

    Read the article

  • What requirements should an IT department workspace meet?

    - by Rob
    Hello all, I need to provide a list of workspace requirements to the IT director for my network operations team. So far I have:
    - Secure workspace - so nothing gets stolen and people can't come up to us asking for support (they need a ticket from the helpdesk)
    - Quiet area - so that we can work and not be disturbed by the loud project managers who play soccer in the office sometimes
    - A large table or desk where we can set up and/or configure systems and servers if needed
    What else do we need? Thanks in advance.

    Read the article

  • Purpose of LAN Domain?

    - by Leonard Thieu
    What is the purpose of creating a domain name for your LAN? I'm using DD-WRT on my router and assigned local.moofz.com as the LAN domain. I set up Apache HTTP servers on two of the computers on my LAN to test it out. I could reach them on oneil.local.moofz.com and vala.local.moofz.com, but I found out that I could also reach them via their hostnames oneil and vala. If I can reach them through their hostnames, then what would be the purpose of having a domain name for my LAN?

    Read the article

  • What You Said: Cutting the Cable Cord

    - by Jason Fitzpatrick
    Earlier this week we asked you if you’d cut the cable and switched to alternate media sources to get your movie and TV fix. You responded and we’re back with a What You Said roundup. One of the recurrent themes in reader comments, and one we must admit we didn’t expect to see with such prevalence, was the number of people who had ditched cable for over-the-air HD broadcasts. Fantasm writes: I have a triple HD antenna array, mounted on an old TV tower, each antenna facing out from a different side of the triangular tower. On top of the tower are two 20+ year old antennas… I’m 60 miles from Toronto and get 35 channels, most in brilliant HD… Anything else comes from the Internet… Never want cable or sat again… Grant uses a combination of streaming services and, like Fantasm, manages to pull in HD content with a nice antenna setup: We use Netflix, Hulu Plus, Amazon Prime, Crackle, and others on a Roku as well as OTA on a Tivo Premier. The Tivo is simply the best DVR interface I have ever used. The Tivo Netflix application, though, is terrible, and it does not support Amazon Prime. Having both boxes makes it easy to use all of the services.

    Read the article

  • SSIS Send Mail Task and ForceExecutionValue Error

    - by Kevin Shyr
    I tried to use "ForcedExecutionValue" on several Send Mail Tasks and log the execution into an ExecValueVariable, so that at the end of the package I can write to a table recording whether the data check was successful or not (determined by whether an email was sent out). I set up a Boolean variable that is accessible at the package level, then set up my Send Mail Task as in the screenshot below, with Boolean as my ForcedExecutionValueType. When I run the package, I get the error described below. Just to make sure this is not another issue SSIS has with the Boolean type (you also can't set a variable of type Boolean from xp_cmdshell), I tried variables of types String, Int32 and DateTime with the corresponding ForcedExecutionValueType. The only way to get around this error was to set my variable to type Object, but then when you try to get the value out later, the Object is null. I didn't spend enough time on this to see whether it's really a bug in SSIS or just how the Send Mail Task works. I just want to log the error and will circle back on this later to narrow down the issue some more. In the meantime, please share if you have run into the same problem. The current workaround is to attach a script task at the end (see the sketch below). Also, I need to note 2 existing limitations:
    - The data check needs to be done serially because every check needs an inner join to a master table.
    - The master table has all the data in a single XML column and hence needs to be retrieved with XQuery (a fundamental design flaw that needs to be changed).
    The next iteration will be to change this design into a FOR loop and pull the checking query from a table with all the info needed for the email task, but that is being put to the back of the priority list.
    Error message:
    Error: 0xC001F009 at CountCheckBetweenODSAndCleanSchema: The type of the value being assigned to variable "User::WasErrorEmailEverSent" differs from the current variable type. Variables may not change type during execution. Variable types are strict, except for variables of type Object.
    Error: 0xC0019001 at Send Mail Task on count mismatch: The wrapper was unable to set the value of the variable specified in the ExecutionValueVariable property.
    Screenshot of my Send Mail Task setup:
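
    The workaround script task can be as small as setting the flag directly. Below is a minimal sketch of the Main method that would sit inside the designer-generated ScriptMain class, assuming User::WasErrorEmailEverSent has been added to the task's ReadWriteVariables:

        // C# Script Task body; the surrounding ScriptMain class, its Dts property
        // and the ScriptResults enum are generated by the SSIS designer.
        public void Main()
        {
            // Record that the error-email path was reached so the package can
            // log the outcome of the data check to a table at the end.
            Dts.Variables["User::WasErrorEmailEverSent"].Value = true;
            Dts.TaskResult = (int)ScriptResults.Success;
        }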

    Read the article

  • Squid/Kerberos authentication with only Linux

    - by user28362
    Hi, I would like to know if it is possible to let a Windows XP machine authenticate to Squid (Linux) using Kerberos without needing an Active Directory domain. I only want to create a Kerberos ticket on the client side, which should give the client access to Squid (using IE). I only found tutorials about configuring AD/Squid, not an environment with only Linux servers. Thanks.
    Update: The Kerberos setup is done correctly; the proxy and client can get tickets. As for the browser (FF/IE), I get:
    ERROR Cache Access Denied
    While trying to retrieve the URL: http://www.google.com/
    The following error was encountered:
    * Cache Access Denied.
    Sorry, you are not currently allowed to request: http://www.google.com/ from this cache until you have authenticated yourself.
    In the Kerberos log, I get:
    squid_kerb_auth: Got 'YR ElRNTVMTUABBAABAB4IIogAAAAAAAAAAAAAAAAAAAAAFASgDAAAADw==' from squid (length: 59).
    squid_kerb_auth: parseNegTokenInit failed with rc=101
    squid_kerb_auth: received type 1 NTLM token
    This message is strange, as I didn't configure NTLM. It looks like the browser is using the wrong authentication method.

    Read the article

  • Internet connectivity issues with one router, but everything works OK with another router

    - by user825904
    I have a TP-Link ADSL router, and when I enter my username and password on its setup page everything works fine. Now I have another router, a Netgear. When I enter the same username and password, the internet works OK for about 50% of websites, but for the other 50% the page never loads and it just hangs. The status bar says "website found, waiting for reply" and it hangs there and no site is displayed. I wonder which setting is different on these two routers. The TP-Link router was bought from a local shop, but the Netgear router is from a different country. Can that make a difference?

    Read the article

  • Restricting SSRS subscriptions to shared schedules only

    - by Matt Frear
    Hi all, I'm reasonably new to SQL Server Reporting Services and Report Manager, and completely new to SSRS subscriptions. We're running SSRS 2008. Out of the box it seems that a user with the Browser role can create a subscription to a report and schedule it to run at any time they choose. As an admin I have set up a shared schedule called "Overnight reports" that runs every night at 1am. I would like it so that when a regular user creates their subscription they can only use one of my shared schedules, so that their subscription will only run overnight. Is this possible? Thanks -Matt

    Read the article

  • Mouse Icon Distorted on Secondary Display

    - by Nathan Taylor
    I have a strange issue with a dual monitor, extended desktop setup where the mouse is always fine on the primary monitor, but sometimes when I move to the secondary display the icon becomes garbled and distorted (sometimes it just looks like a vertical line, instead of a pointer). If I move the mouse back and forth rapidly between primary and secondary displays the level of "garbledness" of the icon will change and sometimes go away completely. If I switch the display settings and set it to "Duplicate Monitor 1" then I end up with a garbled icon on the primary display and an accurate one on the secondary. Very annoying. Computer is Windows 7 Ultimate with an HD8750 and the newest video drivers. Monitors are two Dell 24" displays connected via DVI cables. I have also tried VGA cables.

    Read the article

  • Meet the New Windows Azure

    - by ScottGu
    Today we are releasing a major set of improvements to Windows Azure. Below is a short summary of just a few of them:

    New Admin Portal and Command Line Tools
    Today’s release comes with a new Windows Azure portal that will enable you to manage all features and services offered on Windows Azure in a seamless, integrated way. It is very fast and fluid, supports filtering and sorting (making it much easier to use for large deployments), works on all browsers, and offers a lot of great new features – including built-in VM, Web site, Storage, and Cloud Service monitoring support. The new portal is built on top of a REST-based management API within Windows Azure – and everything you can do through the portal can also be programmed directly against this Web API. We are also today releasing command-line tools (which, like the portal, call the REST Management APIs) to make it even easier to script and automate your administration tasks. We are offering both a PowerShell (for Windows) and a Bash (for Mac and Linux) set of tools to download. Like our SDKs, the code for these tools is hosted on GitHub under an Apache 2 license.

    Virtual Machines
    Windows Azure now supports the ability to deploy and run durable VMs in the cloud. You can easily create these VMs using a new Image Gallery built into the new Windows Azure Portal, or alternatively upload and run your own custom-built VHD images. Virtual Machines are durable (meaning anything you install within them persists across reboots) and you can use any OS with them. Our built-in image gallery includes both Windows Server images (including the new Windows Server 2012 RC) as well as Linux images (including Ubuntu, CentOS, and SUSE distributions). Once you create a VM instance you can easily Terminal Server or SSH into it in order to configure and customize the VM however you want (and optionally capture your own image snapshot of it to use when creating new VM instances). This provides you with the flexibility to run pretty much any workload within Windows Azure. The new Windows Azure Portal provides a rich set of management features for Virtual Machines – including the ability to monitor and track resource utilization within them. Our new Virtual Machine support also enables the ability to easily attach multiple data-disks to VMs (which you can then mount and format as drives). You can optionally enable geo-replication support on these – which will cause Windows Azure to continuously replicate your storage to a secondary data-center at least 400 miles away from your primary data-center as a backup. We use the same VHD format that is supported with Windows virtualization today (and which we’ve released as an open spec), which enables you to easily migrate existing workloads you might already have virtualized into Windows Azure. We also make it easy to download VHDs from Windows Azure, which also provides the flexibility to easily migrate cloud-based VM workloads to an on-premise environment. All you need to do is download the VHD file and boot it up locally, no import/export steps required.

    Web Sites
    Windows Azure now supports the ability to quickly and easily deploy ASP.NET, Node.js and PHP web-sites to a highly scalable cloud environment that allows you to start small (and for free) and then scale up as your traffic grows. You can create a new web site in Azure and have it ready to deploy to in under 10 seconds. The new Windows Azure Portal provides built-in administration support for Web sites – including the ability to monitor and track resource utilization in real-time. You can deploy to web-sites in seconds using FTP, Git, TFS and Web Deploy. We are also releasing tooling updates today for both Visual Studio and Web Matrix that enable developers to seamlessly deploy ASP.NET applications to this new offering. The VS and Web Matrix publishing support includes the ability to deploy SQL databases as part of web site deployment – as well as the ability to incrementally update database schema with a later deployment. You can integrate web application publishing with source control by selecting the “Set up TFS publishing” or “Set up Git publishing” links on a web-site’s dashboard. Doing so will enable integration with our new TFS online service (which enables a full TFS workflow – including elastic build and testing support), or create a Git repository that you can reference as a remote and push deployments to. Once you push a deployment using TFS or Git, the deployments tab will keep track of the deployments you make, and enable you to select an older (or newer) deployment and quickly redeploy your site to that snapshot of the code. This provides a very powerful DevOps workflow experience. Windows Azure now allows you to deploy up to 10 web-sites into a free, shared/multi-tenant hosting environment (where a site you deploy will be one of multiple sites running on a shared set of server resources). This provides an easy way to get started on projects at no cost. You can then optionally upgrade your sites to run in a “reserved mode” that isolates them so that you are the only customer within a virtual machine. And you can elastically scale the amount of resources your sites use – allowing you to increase your reserved instance capacity as your traffic scales. Windows Azure automatically handles load balancing traffic across VM instances, and you get the same, super fast, deployment options (FTP, Git, TFS and Web Deploy) regardless of how many reserved instances you use. With Windows Azure you pay for compute capacity on a per-hour basis – which allows you to scale up and down your resources to match only what you need.

    Cloud Services and Distributed Caching
    Windows Azure also supports the ability to build cloud services that support rich multi-tier architectures, automated application management, and scale to extremely large deployments. Previously we referred to this capability as “hosted services” – with this week’s release we are now referring to this capability as “cloud services”. We are also enabling a bunch of new features with them.

    Distributed Cache
    One of the really cool new features being enabled with cloud services is a new distributed cache capability that enables you to set up and use a low-latency, in-memory distributed cache within your applications. This cache is isolated for use just by your applications, and does not have any throttling limits. This cache can dynamically grow and shrink elastically (without you having to redeploy your app or make code changes), and supports the full richness of the AppFabric Cache Server API (including regions, high availability, notifications, local cache and more). In addition to supporting the AppFabric Cache Server API, it also now supports the Memcached protocol – allowing you to point code written against Memcached at it (no code changes required). The new distributed cache can be set up to run in one of two ways:
    1) Using a co-located approach. In this option you allocate a percentage of memory in your existing web and worker roles to be used by the cache, and then the cache joins the memory into one large distributed cache. Any data put into the cache by one role instance can be accessed by other role instances in your application – regardless of whether the cached data is stored on it or another role. The big benefit with the “co-located” option is that it is free (you don’t have to pay anything to enable it) and it allows you to use what might have been otherwise unused memory within your application VMs.
    2) Alternatively, you can add “cache worker roles” to your cloud service that are used solely for caching. These will also be joined into one large distributed cache ring that other roles within your application can access. You can use these roles to cache 10s or 100s of GBs of data in-memory very effectively – and the cache can be elastically increased or decreased at runtime within your application.

    New SDKs and Tooling Support
    We have updated all of the Windows Azure SDKs with today’s release to include new features and capabilities. Our SDKs are now available for multiple languages, and all of the source in them is published under an Apache 2 license and maintained in GitHub repositories. The .NET SDK for Azure has in particular seen a bunch of great improvements with today’s release, and now includes tooling support for both VS 2010 and the VS 2012 RC. We are also now shipping Windows, Mac and Linux SDK downloads for languages that are offered on all of these systems – allowing developers to develop Windows Azure applications using any development operating system.

    Much, Much More
    The above is just a short list of some of the improvements that are shipping in either preview or final form today – there is a LOT more in today’s release. These include new Virtual Private Networking capabilities, new Service Bus runtime and tooling support, the public preview of the new Azure Media Services, new Data Centers, significantly upgraded network and storage hardware, SQL Reporting Services, new Identity features, support within 40+ new countries and territories, and much, much more. You can learn more about Windows Azure and sign up to try it for free at http://windowsazure.com. You can also watch a live keynote I’m giving at 1pm June 7th (later today) where I’ll walk through all of the new features. We will be opening up the new features I discussed above for public usage a few hours after the keynote concludes. We are really excited to see the great applications you build with them. Hope this helps, Scott
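
    To give a feel for how the co-located cache is consumed from role code, here is a minimal sketch using the Windows Azure Caching client (Microsoft.ApplicationServer.Caching); it assumes the caching client package is referenced and a dataCacheClient section in the role configuration points at the cache:

        using System;
        using Microsoft.ApplicationServer.Caching;

        class CacheDemo
        {
            static void Main()
            {
                // Reads the dataCacheClient settings from configuration and connects
                // to the default cache hosted across the co-located role instances.
                DataCacheFactory factory = new DataCacheFactory();
                DataCache cache = factory.GetDefaultCache();

                // A value put by one role instance is visible to every other instance.
                cache.Put("greeting", "Hello from the distributed cache");
                string value = (string)cache.Get("greeting");
                Console.WriteLine(value);
            }
        }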

    Read the article

  • ATI Radeon 5850: I can't seem to get 3 monitors up at the same time (2 monitors and an HDTV)

    - by Jan
    I've just bought the top-end ATI Radeon card with 2 normal monitor ports and HDMI. The idea was to continue using my dual-screen setup as always and to use the last plug, the HDMI, on my TV. I got a new 52-inch HD TV with all the necessary bits. This should work fine. But in Display Properties I still only get my 2 monitors as options, not the digital TV. When I unplug 1 monitor and restart the computer, I get the TV and the other monitor, but never all 3 at the same time. Why is this? Where can I go to tell it that I need all 3 screens at the same time? Also, I get a message saying my graphics card also sends sound through the HDMI cable, but the TV tells me it's receiving a sound format that it does not understand. Any ideas on that too, while we're at it?

    Read the article

  • vsftpd: refusing to run with writable root inside chroot

    - by MrROY
    I want to set up an anonymous-only FTP server (able to upload files). Here is my config file:
    listen=YES
    anonymous_enable=YES
    anon_root=/var/www/ftp
    local_enable=YES
    write_enable=YES
    anon_upload_enable=YES
    anon_mkdir_write_enable=YES
    xferlog_enable=YES
    connect_from_port_20=YES
    chroot_local_user=YES
    dirmessage_enable=YES
    use_localtime=YES
    secure_chroot_dir=/var/run/vsftpd/empty
    rsa_cert_file=/etc/ssl/private/vsftpd.pem
    pam_service_name=vsftpd
    But when I try to connect to it:
    kan@kan:~$ ftp yxxxng.bej
    Connected to yxxx.
    220 (vsFTPd 2.3.5)
    Name (yxxxg.bej:kan): anonymous
    331 Please specify the password.
    Password:
    500 OOPS: vsftpd: refusing to run with writable root inside chroot()
    Login failed
    Can anyone help?

    Read the article

  • Exchange 2003 HTTP Account Error

    - by Ryaner
    We are trying to get one of our users connected to our Exchange 2003 server using the HTTP method, as they already have an existing Exchange account on another server. The setup goes through and they appear to get connected fine; however, none of the subfolders are listed. Instead we get one folder named "Error-Pls file a Bug". The usual Google search turns up nothing useful. Does anyone know how to fix this? Or has anyone actually gotten Outlook (2003 or 2007) to connect to an Exchange 2003 server this way?

    Read the article

  • Installing Domain Controller on Hyper-V Host

    - by MichaelGG
    Given a resource-limited setup consisting of 2 host machines (HyperV-01 and HyperV-02), is it OK to put the domain controllers in the parent partition instead of their own VM? The main reason is that if the DCs go into a child partition, starting from cold on both machines could lead to a bit of an issue, as there'd be no DCs around until well after both parents have booted. I'm guessing this might cause undesirable effects. Am I correct to be worried about joining the host systems to a domain that exists only on VMs? The biggest drawback I've heard so far is that if AD gets heavily used, its resources could cut into Hyper-V's. I'm not concerned about that for this deployment. Any other suggestions? (Besides finding a 3rd machine and running AD on it.)

    Read the article

  • Change Linux Console's Default Monitor

    - by Tim M
    Is there any way to specify which monitor the console is displayed on in Linux? Details: I have a 3 monitor setup with 2 video cards. When I boot the computer, the BIOS displays on the PCI graphics card (which has a small monitor). When starting Linux, the console is displayed on the same monitor. Is there a way to have the console output on a different monitor? I'm using the vesafb framebuffer. I don't see a way in my BIOS to change the default video card.

    Read the article

  • Setting up jQuery after an ASP.NET AJAX partial postback

    - by Steve Clements
    OK, so for some reason you have a mega mashup solution with ASP.net AJAX, jQuery and web forms. Perhaps you are just on the migration from the AjaxControlToolkit to the jQuery UI framework – who knows!! Anyway, the problem is that when you post back with something like an UpdatePanel, you will find that your nicely set-up jQuery stuff, like the datepicker for example, will no longer work. You may have something like this…
    $(document).ready(function () {
        $(".date-edit").datepicker({ dateFormat: "dd/mm/yy", firstDay: 1, showOtherMonths: true, selectOtherMonths: true });
    });
    When your ASP.net UpdatePanel posts back, you will find that your datepicker has gone. Bugger! Well, you need to add this little gem to set it back up again once the UpdatePanel comes back to the page.
    var prm = Sys.WebForms.PageRequestManager.getInstance();
    prm.add_endRequest(function () {
        $(".date-edit").datepicker({ dateFormat: "dd/mm/yy", firstDay: 1, showOtherMonths: true, selectOtherMonths: true });
    });
    Or, like me, you could have a javascript function, something like InitPage(); do all your work in there and call that on document.ready and endRequest. Your choice…you have the power.
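
    A server-side variation on the same InitPage() idea is to register the call through ScriptManager, which emits it on async postbacks as well as full ones. A minimal sketch, assuming a client-side InitPage() function as described above; the page class name is illustrative:

        using System;
        using System.Web.UI;

        public partial class EditPage : Page
        {
            protected void Page_Load(object sender, EventArgs e)
            {
                // Unlike ClientScript.RegisterStartupScript, the ScriptManager overload
                // also renders the script during asynchronous (UpdatePanel) postbacks,
                // so InitPage() re-wires the datepickers after each partial render.
                ScriptManager.RegisterStartupScript(
                    this, typeof(EditPage), "initPage", "InitPage();", true);
            }
        }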

    Read the article

  • SVN: Error validating server certificate for svn hook linux

    - by Dr Casper Black
    Hi, I managed to set up an SVN (over SSL) server and the TortoiseSVN client on Windows. I made a post-commit hook for a test project. The post-commit hook updates the web directory so the PHP app can be run with the newest version. It all works when done over the shell. The only problem is that when I commit the changes from the client on Windows, the change is committed but the hook fails with "post-commit hook failed (exit code 1)" and this output:
    Error validating server certificate for 'https://SERVER_IP:443':
    - The certificate is not issued by a trusted authority. Use the fingerprint to validate the certificate manually!
    - The certificate hostname does not match.
    Certificate information:
    - Hostname: DEVSRVR
    - Valid: from Fri, 28 Jan 2011 09:22:45 GMT until Sat, 28 Jan 2012 09:22:45 GMT
    - Issuer: PHP, SS, SS, SRB
    - Fingerprint: 5f:d0:50:d6:dd:a6:d4:64:a5:ac:3a:4b:7c:7d:33:e3:75:dd:23:9f
    (R)eject, accept (t)emporarily or accept (p)ermanently?
    svn: OPTIONS of 'https://SERVER_IP/svn/myproject/trunk': Server certificate verification failed: certificate issued for a different hostname, issuer is not trusted (https://SERVER_IP)

    Read the article
