Search Results

Search found 10342 results on 414 pages for 'biztalk testing'.


  • can't execute scripts compiled with shc

    - by serilain
    I'm trying to use shc to compile a shell script so that I can set the SUID bit on it and obfuscate what it's doing (I'm attempting to have it run as part of all new users' .bashrc). As a test, I wrote a script that's simply:

        #!/bin/bash
        env

    and compiled it using shc -r -f script.sh. However, when I try to run the resulting binary by simply doing ./script.sh.x, even after setting its permissions to 777 (just for testing purposes), I get "Operation not permitted; killed" unless I run it as sudo (which I don't want to have to do). Am I running afoul of some Ubuntu permissions that won't let me run binaries created by shc? Thanks!
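    A minimal reproduction of the steps described, assuming shc is installed from the Ubuntu repositories; the script name, ownership and SUID bits are only for illustration:

        # test script (script.sh) -- just prints the environment
        printf '#!/bin/bash\nenv\n' > script.sh

        # compile with shc: -r relaxes security so the binary runs on other hosts,
        # -f names the source file; this produces script.sh.x (binary) and script.sh.x.c
        shc -r -f script.sh

        # give it to root and set the SUID bit, then make it world-executable
        sudo chown root:root script.sh.x
        sudo chmod 4755 script.sh.x

        # run it as an ordinary user
        ./script.sh.x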

    Read the article

  • Downgrade 'local' packages in Debian/Ubuntu

    - by Matt Joiner
    I recently unticked the "pre-released updates" option in Software Sources on my Ubuntu Lucid 10.04.1 installation. The Ubuntu wiki states the following regarding this source: "The proposed updates are updates which are waiting to be moved into the recommended updates queue after some testing. They may never reach recommended or they may be replaced with a more recent update." Roughly 20 installed packages have indeed not made it into recommended updates, and they occasionally cause conflicts when I install new software, as related packages of the newer version are unavailable now that I've disabled the source. How can I force a downgrade of all packages for which an earlier version exists in an enabled repository?
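    For a single package, a forced downgrade looks roughly like the sketch below (package name and version are placeholders); whether this can be applied wholesale to every affected package is the open question:

        # show which versions are available and which repository they come from
        apt-cache policy some-package

        # install the specific older version that exists in an enabled repository
        sudo apt-get install some-package=1.2.3-0ubuntu1

        # aptitude accepts the same pkg=version syntax and can resolve the fallout interactively
        sudo aptitude install some-package=1.2.3-0ubuntu1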

    Read the article

  • IE 8 doesn't appear to clear cache on demand. Is anyone else seeing this?

    - by Steve
    I have a client who uploads updated PDF files to her Concrete5 CMS through the file manager, replacing the old file with the same name. She then does a CMS "clear cache" and exits as she should. Then, in testing, she finds that the old file still comes up when clicking on the link. On further review, the CMS file manager version tracking shows that the file has been updated, and, for me, the new file comes up as it should when clicking the link. My client has also refreshed her browser cache and still she only gets the old file when clicking on the link. She says that, while she can't seem to force an immediate cache update, overnight it appears to update. My client is also part of a large company-wide LAN and intranet. Is it possible that there is a cache somewhere outside of her local browser and the CMS cache that is not updating?

    Read the article

  • Mapping capslock to control on Mac OS X: works for some things, but not others?

    - by keflavich
    I've mapped my capslock key to Control using the Modifier Keys mapping in System Preferences: Keyboard. I've also tried mapping to "right control" instead of "left control", as per http://hints.macworld.com/article.php?story=20060825072451882, using a plist editor. The mapping seems to work in all cases except one: I can't use capslock with left-shift to make key combinations, or apparently do anything else. Capslock (as Control) with right-shift works. I'm primarily testing by using control-tab / control-shift-tab to switch between tabs. Using the on-screen keyboard viewer, I can get capslock-shift-(just about anything) to work, but not capslock-leftshift-tab. My best guess is that somehow the particular keyboard I'm working on is faulty, but I'm curious whether anyone else can reproduce this or has any ideas.

    Read the article

  • Configure RAID on Red Hat 5

    - by Sopolin
    Hi all, I have a problem configuring RAID on Red Hat Enterprise Linux. When I create a RAID array on two hard disks it works successfully, and it still works normally after I remove one hard disk (I pulled one disk to test the RAID configuration). But after I put both hard disks back and create another file, the RAID is cleared. My question is: why does turning off the server machine clear the RAID that I configured before turning it off? Could anyone help to solve this problem? Thanks, Ung Sopolin
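    Assuming this is Linux software RAID built with mdadm (the question doesn't say whether it is mdadm or a controller/BIOS RAID), the array normally survives a reboot only if it is recorded in /etc/mdadm.conf. A rough sketch with placeholder device names:

        # create a RAID1 array from two disks (device names are examples only)
        mdadm --create /dev/md0 --level=1 --raid-devices=2 /dev/sdb1 /dev/sdc1

        # record the array so it is assembled again at boot
        mdadm --detail --scan >> /etc/mdadm.conf

        # put a filesystem on it and mount it
        mkfs.ext3 /dev/md0
        mount /dev/md0 /mnt/raid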

    Read the article

  • Remote kill, upload, execute file

    - by Masoud M.
    I'm developing a program and I need to upload my xyz.exe file to many host machines and execute it frequently. I need a client-server tool that performs the following steps on each host after an update signal from my PC:

    1. Kill any running process named xyz.exe.
    2. Download my new xyz.exe.
    3. Execute the new xyz.exe.

    I know about tools like PsExec, but I need a tool with a better user interface and more power. Is there any tool to do this? UPDATE: The systems are on the same LAN, the OS is Windows (XP or 7), and no full remote access is needed. I'm a developer; my program runs on the remote hosts and I'm testing my application.
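    For comparison, the same three steps done with PsExec look roughly like this sketch; the host name, credentials and paths are placeholders, and it assumes the C$ admin share is reachable:

        rem 1. kill the running copy on the remote host
        psexec \\HOST01 -u DOMAIN\admin -p secret taskkill /F /IM xyz.exe

        rem 2. push the new build to the remote host
        copy /Y xyz.exe \\HOST01\C$\apps\xyz.exe

        rem 3. start it again without waiting for it to exit (-d)
        psexec \\HOST01 -u DOMAIN\admin -p secret -d C:\apps\xyz.exe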

    Read the article

  • How do I get Safari to ignore the SSL certificate error?

    - by Tangopop
    In IE 6, 7, and 8 and Firefox 3.6.3 and 3.0.5, I have installed a local SSL certificate on the machine I am testing on and have gotten the browser to ignore the SSL error (which comes from one of my web test servers). Now I am trying to do the same thing in Safari 4, with no luck. Basically I am running some automated scripts to test my website before it goes live, and I need to be able to ignore these errors as the scripts all run autonomously. This is the error screen I am trying to avoid: http://library.bowdoin.edu/news/images/ezproxy-err/safari.jpg As I say, I have installed the certificate locally, and the IE 7 browser on the same machine works fine.

    Read the article

  • Apache on Mac Mavericks issue

    - by Michael
    Trying to run Apache so that I can create a testing server on my Mac. When I start Apache it starts, but it doesn't run (no connection to localhost). I'll paste the terminal session below; you'll see that after starting there are no httpd processes, and I also checked what was using port 80 (I don't entirely know what that output means).

        Michaels-MacBook-Pro-3:~ michaelramos$ sudo apachectl start
        Michaels-MacBook-Pro-3:~ michaelramos$ ps aux | grep httpd
        michaelramos 348 0.0 0.0 2442000 624 s000 S+ 8:51AM 0:00.00 grep httpd
        Michaels-MacBook-Pro-3:~ michaelramos$ sudo apachectl start
        org.apache.httpd: Already loaded
        Michaels-MacBook-Pro-3:~ michaelramos$ sudo lsof -i ':80'
        COMMAND PID USER FD TYPE DEVICE SIZE/OFF NODE NAME
        ocspd 96 root 18u IPv4 0x8402f926599c58df 0t0 TCP dhcp-92-67.radford.edu:49267->108.162.232.196:http (ESTABLISHED)
        ocspd 96 root 20u IPv4 0x8402f926599c58df 0t0 TCP dhcp-92-67.radford.edu:49267->108.162.232.196:http (ESTABLISHED)
        ocspd 96 root 21u IPv4 0x8402f926599c50f7 0t0 TCP dhcp-92-67.radford.edu:49268->108.162.232.206:http (ESTABLISHED)
        ocspd 96 root 23u IPv4 0x8402f926599c50f7 0t0 TCP dhcp-92-67.radford.edu:49268->108.162.232.206:http (ESTABLISHED)
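    When httpd exits immediately like this, a sketch of the usual follow-up checks, assuming the stock Apache that ships with Mavericks (its error log lives under /var/log/apache2):

        # check the configuration for syntax errors that would stop httpd from starting
        sudo apachectl configtest

        # try another start and look at the most recent error log entries
        sudo apachectl start
        tail -n 50 /var/log/apache2/error_log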

    Read the article

  • Unresponsive virtual OS

    - by confusedGeek
    Hopefully someone has a suggestion on how to resolve this.

    Configuration:
    Host: Win 2003 R2 w/ Virtual Server 2005 R2
    Virtual1: Win 2003 R2 w/ SQL Server 2005
    Virtual2: Win 2003 R2 w/ WSS 3.0

    Situation: This past weekend the power went out and took down the servers (no UPS; it's a desktop standing in as a dev testing server). Since the servers went down, Virtual2, after running WSS fairly heavily for an hour or two, becomes unresponsive via HTTP. If I log in via Virtual Server's remote control I don't get anything beyond a background screen. The CPU counter on the Virtual Server master status page shows that it isn't doing anything. The only thing I have been able to do is turn off Virtual2, which loses any state changes. Shutdown commands issued from the Virtual Server master status page are ignored. After restarting Virtual2, the event logs and application logs don't indicate what caused the problem. Anyone have an idea as to how to repair the OS, or what the problem could be? Thanks ahead of time.

    Read the article

  • Virtualhost Wildcard Subdomains

    - by Khuram
    We have one static IP on which we have routed our company website. We have set up a local machine on Windows with WAMP to run our testing server, and we want virtual hosts to test our different apps. However, we now have a new project which uses wildcard subdomains. How can we create the wildcard subdomains in our VirtualHost entries? We use:

        NameVirtualHost *

        <VirtualHost *>
            ServerAdmin admin@test
            DocumentRoot "E:/Wamp/www/corporate"
            ServerName companysite.com
        </VirtualHost>

        <VirtualHost *>
            ServerAdmin admin@test
            DocumentRoot "E:/Wamp/www/project"
            ServerName project.companysite.com
        </VirtualHost>

        <VirtualHost *>
            ServerAdmin admin@test
            DocumentRoot "E:/Wamp/www/project"
            ServerName *.project.companysite.com
        </VirtualHost>

    However, the last * wildcard does not work. Any help?

    Read the article

  • PowerShell Remoting: No credentials are available in the security package

    - by TheSciz
    I'm trying to use the following script:

        $password = ConvertTo-SecureString "xxxx" -AsPlainText -Force
        $cred = New-Object System.Management.Automation.PSCredential("domain\Administrator", $password)
        $session = New-PSSession 192.168.xxx.xxx -Credential $cred
        Invoke-Command -Session $session -ScriptBlock { New-Cluster -Name "ClusterTest" -Node HOSTNAME }

    to remotely create a cluster (it's for testing purposes) on a Windows Server 2012 VM. I'm getting the following error:

        An error occurred while performing the operation.
        An error occurred while creating the cluster 'ClusterTest'.
        An error occurred creating cluster 'ClusterTest'.
        No credentials are available in the security package
            + CategoryInfo          : NotSpecified: (:) [New-Cluster], ClusterCmdletException
            + FullyQualifiedErrorId : New-Cluster,Microsoft.FailoverClusters.PowerShell.NewClusterCommand

    All of my other remote commands (installing/making changes to DNS, DHCP, NPAS, GP, etc.) work without an issue. Why is this one any different? The only difference is in the -ScriptBlock parameter. Help!

    Read the article

  • Is it possible to use the IE10 App without making Internet Explorer the default browser?

    - by nhinkle
    Windows 8 comes with two versions of Internet Explorer: the normal desktop version, which looks just like IE9, and the Modern UI version, which is a full-screen tablet-style app. By default, links opened in desktop mode open in desktop IE, and links opened in Modern UI apps open in the full-screen app. When you set a new default browser (like Google Chrome, which has a Modern UI mode now), you can no longer access IE10 in the Modern UI at all - the tile disappears from the start screen, and there's no way to manually invoke it. I don't use IE10 much, but I'd like to have access to it in Metro mode, because it's handy for testing things out. I don't want to have IE be my default browser though. Is there any way to get the IE10 "App" to show up without setting IE to be the default browser everywhere?

    Read the article

  • OS X clients ignoring Windows print server permissions

    - by Ilumiari
    I'm in the process of testing a Windows Server 2008 R2 print server for a mixed OS X/Windows environment. Any security permissions (AD groups) I set for the printers on the print server are not honoured by the OS X clients. Only if I remove absolutely all permissions for a given printer will an OS X client not print to it. The Windows clients honour the permissions as expected. The PrintService log doesn't record any activity when an unprivileged Windows client attempts to print, but it records a typical print job when an unprivileged OS X client attempts to print. Has anyone encountered this problem before, and do you have a fix? With 600-700 clients, a number of which are dual-booting, restricting by IP address is not viable. EDIT: The jobs are definitely going through the print server; they show up in the logs with their AD credentials.

    Read the article

  • Maximum MTU size

    - by user192702
    I thought one of the issues I'm experiencing in the following question was due to MTU, and rightfully so: ESXi 5 VM Putty session hangs, vSphere client timing out. However, when I tried testing the maximum MTU size, it seems there's just no limit. I thought Ethernet only allows a maximum MTU of 1500 bytes, but I'm up to 54450:

        ping -l 54450 192.168.10.7

        Pinging 192.168.50.7 with 54450 bytes of data:
        Reply from 192.168.10.7: bytes=54450 time=1081ms TTL=62
        Reply from 192.168.10.7: bytes=54450 time=1079ms TTL=62
        Reply from 192.168.10.7: bytes=54450 time=1079ms TTL=62
        Reply from 192.168.10.7: bytes=54450 time=1079ms TTL=62

        Ping statistics for 192.168.10.7:
            Packets: Sent = 4, Received = 4, Lost = 0 (0% loss),
        Approximate round trip times in milli-seconds:
            Minimum = 1079ms, Maximum = 1081ms, Average = 1079ms
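    Worth noting: without the Don't Fragment flag, Windows ping simply fragments the ICMP payload across many frames, so a large -l value says nothing about the link MTU. A rough sketch of how the MTU is usually probed, using the same placeholder address:

        rem -f sets the Don't Fragment bit, -l sets the payload size;
        rem 1472 bytes of payload + 28 bytes of ICMP/IP headers = a 1500-byte frame
        ping -f -l 1472 192.168.10.7

        rem anything larger should fail with "Packet needs to be fragmented but DF set."
        ping -f -l 1473 192.168.10.7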

    Read the article

  • Converting PDF eBooks into a Kindle format

    - by Ender
    Over the past couple of years I've amassed quite a collection of guides, tutorials and ebooks in PDF format. A lot of these are quite useful for work, especially PDF documentation, and rather than have to be at a computer every time I want to read how to do something in Sitecore or to read through a software testing ebook, I'd like to do it on my brand-spanking-new Kindle. However, even though there is now a native PDF reader on the Kindle, due to the nature of PDFs they are practically unreadable. The text doesn't wrap because of how PDFs are laid out, and so far, after a bunch of Google searches, I've yet to find a viable solution for converting my PDFs into a readable Kindle format. Sometimes these books have code or pictures/tables in them, but most of the time they're text-heavy, and to be honest I'd be surprised if there wasn't a free tool to handle converting PDF to one of the (seemingly many) Kindle formats. So, can anyone help me out with this?

    Read the article

  • ldirectord refusing connection when nginx redirects from http to https

    - by Adam
    I am running ldirectord as a load balancer in front of an nginx front-end server. If I set up a redirect from http to https and connect directly to the nginx server, all is well. Connecting via ldirectord causes my connection to be refused. I can connect normally via http or https through ldirectord when I don't have the redirect in place. To add to my confusion, if my application issues the redirect from http to https, it works. I am testing this via curl on the command line (curl: (7) couldn't connect to host, versus a response). I am using the standard ldirectord config (http://www.ultramonkey.org/3/topologies/config/lb/non-fwmark/linux-director/ldirectord.cf), the http and https parts. My nginx config for the redirect is simply:

        location / {
            rewrite ^(.*) https://$host$1 permanent;
        }
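    A minimal sketch of that curl comparison, with hypothetical hostnames standing in for the nginx real server and the ldirectord virtual IP:

        # direct to the nginx real server: the 301 redirect comes back as expected
        curl -v http://realserver.example.com/

        # through the ldirectord virtual IP: the connection is refused
        curl -v http://vip.example.com/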

    Read the article

  • Exclude path from Windows 7 search?

    - by Jez
    It's really quite incredible how bad Windows 7's search seems to be. The latest problem I have with it is that I want to search for filenames including the string "user", so I go to the directory I want to search in (C:\Users\jez\testing), type *user* in the search box, and hit Enter. It gives me... every single file, because "user" is in the path of every file, with everything being under "C:\Users\...". OK, this is useless. Is there a way to search for the string in file names only, rather than in their full paths, or do I need to download some decent third-party search software?

    Read the article

  • Linux servers vs Windows IIS: the sense of using "free" solutions

    - by Rob
    I wonder what the sense is of using "free" open source solutions for serious website applications. I've crawled through and read many server performance tests, and there is one conclusion: IIS seems to be the best choice for high-load applications, in terms of cost effectiveness. This especially concerns Nginx Plus and LiteSpeed users, where paid subscriptions for e.g. a load balancer and extra support in fact cost a lot. So I'm asking: where is it "free" or "cheap" in this case? Even assuming a slightly higher cost for dedicated servers with Windows, it still seems like Windows works out cheaper. In its basic setup, Windows 2012 with IIS offers much more than a standard LAMP stack or other nginx config. Am I missing something? I mean only the general case, for someone who has not already started building their app. I know very well that the cheapest solution is the one someone is already skilled in. Has anyone already done such a real cost calculation for example scenarios?

    Read the article

  • How to fail over to a local account on a Cisco switch/router if the RADIUS server fails?

    - by 3d1l
    I have the following configuration on a switch that I am testing for RADIUS authentication:

        aaa new-model
        aaa authenticaton login default group radius local
        aaa authentication enable default group radius enable
        aaa authorization exec default group radius local
        enable secret 5 XXXXXXXXX
        !
        username admin secret 5 XXXXXXXXX
        !
        ip radius source-interface FastEthernet0/1
        radius-server host XXX.XXX.XXX.XXX auth-port 1812 acct-port 1813 key XXXXXXXXX
        radius-server retransmit 3
        !
        line con 0
        line vty 5 15

    RADIUS authentication is working just fine, but if the server is not available I cannot log into the router with the ADMIN account. What's wrong there? Thanks!

    Read the article

  • Development environment for embedded system

    - by Howard Lee Harkness
    I need to develop software in C/C++ for an embedded system. I have Debian 6 running off of a USB hard drive. I would like to be able to generate a stripped-down kernel with modules and install them either on a CF card or a USB 'thumb' drive. I succeeded in building a Linux 3.6 kernel and running it in Debian off of the USB hard drive, but I am having trouble figuring out how to install it on the thumb drive. I would like a build cycle that looks like this:

    1. Build the module or kernel with the desired software.
    2. Install it on the thumb drive.
    3. Boot and test.

    I would like to use the same system for both development and testing, if that is feasible. I am looking for resources and tutorials that would help me understand how to do this.
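    A rough sketch of what step 2 can look like for the kernel just built, assuming the thumb drive already carries a bootloader pointing at its /boot directory; the device name, mount point and image name are placeholders:

        # build the kernel image and modules in the source tree
        make -j4 bzImage modules

        # mount the thumb drive (device name is an example)
        sudo mount /dev/sdc1 /mnt/thumb

        # install the modules under the thumb drive's root filesystem
        sudo make modules_install INSTALL_MOD_PATH=/mnt/thumb

        # copy the kernel image itself into the thumb drive's /boot
        sudo cp arch/x86/boot/bzImage /mnt/thumb/boot/vmlinuz-3.6-custom

        sudo umount /mnt/thumb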

    Read the article

  • How to copy a SQL Server database on the same server

    - by Sam
    I've got SQL Server 2008 and want to make a copy of a database so that I have a second version of the database for testing on the same server. The Copy Database Wizard is not able to copy the database; it always produces odd error messages about missing objects (using the SMO copy method). When I try to make a backup and restore it under a different database name, the restore keeps the file names of the original database and overwrites them, crashing the original database. So how do I copy a SQL Server database? Shut down SQL Server, copy the physical files and attach them? Maybe a command-line tool for database copy? Shouldn't there be an easy way to make a copy?

    Read the article

  • Add custom Virtual Machine icons to VirtualBox

    - by Iszi
    I'd like to use custom icons to better distinguish machines running the same OS from each other in VirtualBox. Is this possible? If so, what file(s) do I need to add/edit? Examples: I've got two Windows 7 VMs. One I use as a sandbox for testing various things, and the other I use when I need to connect to work (ideally, my personal system - the host machine - never directly connects). I'd like to have perhaps a beaker for the sandbox and a suitcase for the work machine. I've got two Ubuntu VMs. One is BackTrack Linux, the other is a build I'm using to learn more about the OS. I wouldn't mind keeping the regular icon for the latter, but I'd like to use one of BackTrack's icons or images for the former. I'm running VirtualBox 4.1.6 on Windows 7 x64.

    Read the article

  • DB auto failover in C# does not work when the principal server physically goes offline

    - by user62521
    I'm setting up DB auto-failover in C# with SQL Server 2008. I have a 'high safety with automatic failover' mirror using a witness, and my connection string looks like:

        "Server=tcp:DC01; Failover Partner=tcp:DC02; database=dbname; uid=sewebsite;pwd=somerndpwd;Connect Timeout=10;Pooling=True;"

    During testing, when I turn off the SQL Server service on the principal server the auto-failover works like a charm, but if I take the principal server offline (by shutting down the server or killing the network card), auto-failover does not work and my website just times out. I found this article where the second-to-last post suggests that it's because we are using named pipes, which do not work when the principal goes offline, but we force TCP in our connection string. What am I missing to get this DB auto-failover working?

    Read the article

  • Why is my hosts file not working?

    - by elliot100
    I've been using the hosts file for local website development, and it's recently stopped working. No entries other than localhost resolve. I've simplified it to test, so it now contains only:

        127.0.0.1 localhost
        ::1 localhost
        127.0.0.1 test.dev

    localhost responds to ping, test.dev does not.

    - The file is called hosts, with no extension.
    - It has no trailing spaces.
    - It's saved in C:\WINDOWS\System32\drivers\etc, which matches the value of HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters\DataBasePath.
    - Oddly, despite UAC being on, I can edit, delete and save the file without admin permissions.
    - No proxy is being used, and the PC is not connected to a network for testing.
    - Stopping the DNS Client service seemed to resolve the issue for a few minutes (test.dev briefly resolved) but doesn't any more.
    - The only firewall is Windows'.
    - The machine has been restarted.

    Is there anything else I should try?
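    Two quick checks that pair with the list above, using the same test.dev entry:

        rem clear the resolver cache so a stale entry isn't being reused
        ipconfig /flushdns

        rem confirm what the stack actually resolves the name to
        ping -n 1 test.dev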

    Read the article

  • Postfix, saslauthd, mysql, smtp authentication problems

    - by italiansoda
    Trying to get authentication on my mail server (Ubuntu 10.04) running, but am having trouble. I have a server with Postfix set up for SMTP and Courier as the IMAP server. My Postfix authentication uses Cyrus saslauthd (I haven't really tried Dovecot). The usernames and passwords are stored in a MySQL database. Logging in with IMAP-SSL works on a remote client (Thunderbird), and I can read my mail. I can't get the SMTP side working, and have narrowed the issue down to saslauthd. Testing with:

        testsaslauthd -u 'username' -p 'password' -s smtp

    returns:

        connect() : Permission denied

    The password in the database is encrypted, and I guess testsaslauthd will take a plain-text password and encrypt it. Looking for someone to walk me through getting this working. I'm new to mail servers and have never got one fully working. Thanks. Ask me which log files I should look at/post, which tests to run, and which permissions to check.
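    A "connect() : Permission denied" from testsaslauthd usually concerns the saslauthd socket rather than the credentials themselves. A quick sketch of how to check, assuming the Ubuntu default socket path /var/run/saslauthd/mux:

        # is saslauthd running, and what are the socket directory's permissions?
        ps aux | grep saslauthd
        ls -ld /var/run/saslauthd /var/run/saslauthd/mux

        # re-run the test as root, pointing explicitly at the socket
        sudo testsaslauthd -u username -p password -s smtp -f /var/run/saslauthd/mux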

    Read the article
