Search Results

Search found 29108 results on 1165 pages for 'generic test'.

  • how to remotely open a URL in Firefox in a specific profile?

    - by miernik
    I have several instances of Firefox running with several different profiles, among them profiles named "software" and "test". I am trying to open a URL from a bash script so that it opens in the "test" profile, like this: firefox -P "test" http://www.example.org/ However, it opens in the "software" profile anyway. Any ideas? Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.1.8) Gecko/20100308 Iceweasel/3.5.8 (like Firefox/3.5.8)
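
    A minimal sketch of one workaround that is often suggested, not a confirmed fix: without -no-remote, the command is handed off to an already-running Firefox instance (which can ignore the -P switch), so starting a separate instance bound to the "test" profile may behave differently. The URL is the example one from the question.

      #!/bin/bash
      # start a dedicated instance bound to the "test" profile instead of
      # handing the URL to whichever instance is already running
      firefox -no-remote -P "test" "http://www.example.org/" &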

    Read the article

  • Testing PHP mail() on localhost problem.

    - by Samir Ghobril
    Hey guys, recently I installed msmtp on Linux, and I even sent a mail from the terminal and it worked: echo -e "Subject: Test Mail\r\n\r\nThis is a test mail" |msmtp --debug --from=default -t [email protected] But in PHP, after editing the php.ini file to have this: sendmail_path = '/usr/bin/msmtp -t' and using this piece of code: <?php if ( mail ( '[email protected]', 'Test mail from localhost', 'Working Fine.' ) ){ echo 'Mail sent'; } else{ echo 'Error. Please check error log.'; } ?> I get the "Mail sent" message but don't receive a message in my inbox, not even in the spam folder. Am I doing anything wrong?
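
    A sketch of two things commonly checked in this situation, not a confirmed fix: php.ini string values are normally written with double quotes, and msmtp's -t mode reads the sender and recipients from the message headers, so passing an explicit From: header to mail() can matter. The addresses below are placeholders.

      <?php
      // hypothetical addresses; replace with real ones
      $to      = '[email protected]';
      $headers = 'From: [email protected]' . "\r\n";
      if (mail($to, 'Test mail from localhost', 'Working fine.', $headers)) {
          echo 'Mail sent';
      } else {
          echo 'Error. Please check the error log.';
      }
      // php.ini, with double quotes:  sendmail_path = "/usr/bin/msmtp -t"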

    Read the article

  • Nginx location to match query parameters

    - by Dave
    Is it possible in nginx to have a location {} block that matches query parameters? For example, I want to pick up "preview=true" in this URL and then instruct it to do several different things, all possible in a location block. http://192.158.0.1/web/test.php?hello=test&preview=true&another=var The problem I'm having is that my test stuff doesn't seem to match; it seems like I can only match the URL itself? E.g. location ~ ^(.*)(preview)(.*)$ Or something along those lines?
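
    For reference, a minimal sketch of one common workaround: nginx location blocks only ever match the request path, never the query string, but the built-in $arg_<name> variables expose individual query parameters. The body of the if block is illustrative only.

      location ~ \.php$ {
          # $arg_preview holds the value of the "preview" query parameter
          if ($arg_preview = "true") {
              set $is_preview 1;   # e.g. pass this flag on to the backend
          }
          # ... usual fastcgi/proxy handling ...
      }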

    Read the article

  • puppet agent doesn't retrieve files from master

    - by nicmon
    I have a very basic question regarding Puppet 3.0.1 configuration. I set up a puppet master server (CentOS) with 2 agents (CentOS and Windows 7); all 3 can ping and access each other. There is no error at all. I have copied a file to /etc/puppet/files/test2.txt. My site.pp (/etc/puppet/manifests) contains these lines: node default { include test file { "/tmp/testmaster.txt": owner => root, group => root, mode => 644, source => "puppet:///files/test2.txt" } } but no file is created on the agent servers under /tmp/ when I run "puppet agent --test". Here is the output: [root@agent1 ~]# puppet agent --test Info: Retrieving plugin Info: Caching catalog for agent1.mydomain.com Info: Applying configuration version '1354267916' Finished catalog run in 0.02 seconds "puppet apply /etc/puppet/manifests/site.pp" creates the testmaster.txt under /tmp/ on the master.
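
    One detail worth noting, sketched as a typical setup rather than a confirmed diagnosis: a puppet:///files/... source is served by the master's file server, which normally needs a matching mount declared in /etc/puppet/fileserver.conf; puppet apply works locally because it reads the file straight from disk. The ACL below is illustrative.

      # /etc/puppet/fileserver.conf on the master
      [files]
          path /etc/puppet/files
          allow *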

    Read the article

  • Apache - Restrict to IP not working.

    - by Probocop
    Hi, I have a subdomain that I only want to be accessible internally; I'm trying to achieve this in Apache by editing the VirtualHost block for that domain. Can anybody see where I'm going wrong? Note: my internal IP addresses here are 192.168.10.xxx. My code is as follows: <VirtualHost *:80> ServerName test.epiphanydev2.co.uk DocumentRoot /var/www/test ErrorLog /var/log/apache2/error_test_co_uk.log LogLevel warn CustomLog /var/log/apache2/access_test_co_uk.log combined <Directory /var/www/test> Order allow,deny Allow from 192.168.10.0/24 Allow from 127 </Directory> </VirtualHost> Thanks
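
    For comparison, a sketch of the other common Apache 2.2 spelling of the same restriction, with the default deny written out explicitly (shown only as a variant to try, not a confirmed fix for the block above):

      <Directory /var/www/test>
          Order deny,allow
          Deny from all
          Allow from 192.168.10.0/24
          Allow from 127.0.0.1
      </Directory>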

    Read the article

  • How to make an iOS plugin for Unity3d

    - by DannoEterno
    I've spent the last 2 days reading articles and a book to understand how to make an iOS plugin for Unity. Basically I just need a demo to understand how it works. So far I've tried this process (with really poor luck): I started a new project in Unity and wrote a simple script: using UnityEngine; using System.Collections; using System; using System.Runtime.InteropServices; public class CallPlugin : MonoBehaviour { [DllImport ("__Internal")] private static extern int test(); void Start () { Debug.Log(test()); } } Then I created a project in Xcode with this simple script: extern "C"{ int test() { int che = 5; return che; } } Then I tried: putting the .mm and .h in Assets/Plugins/iOS = nothing; building the Unity project and then adding the .h and .mm to the Xcode project = nothing. In Unity I always get the EntryPointNotFoundException, so Unity sees the file but is unable to reach the method. The problem is... how?! :) Maybe I'm missing something or I've done something wrong? Thanks a lot for any help you can give me :)
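
    A sketch for orientation only: DllImport("__Internal") is resolved only in an actual iOS player build, where Xcode statically links the native code, so running the scene in the Unity editor will always throw EntryPointNotFoundException. A common pattern is to stub the call outside iOS builds; the stub value below is arbitrary.

      using System.Runtime.InteropServices;
      using UnityEngine;

      public class CallPlugin : MonoBehaviour
      {
      #if UNITY_IPHONE && !UNITY_EDITOR
          [DllImport("__Internal")]
          private static extern int test();
      #else
          private static int test() { return -1; }   // editor/desktop stub
      #endif

          void Start() { Debug.Log(test()); }
      }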

    Read the article

  • Cannot get ATI Drivers installed

    - by bittoast67
    I am trying to install the Catalyst driver. The best I can get is a strange resolution problem, and Firefox acts all wonky. The worst I have gotten is low-graphics mode, in which case I just reinstall Ubuntu. I have an HP Pavilion dv7 laptop with a Radeon HD 3200. I plan to try again with a fresh install of Ubuntu 12.04.3, as I have heard it's the most compatible. This is what I have done: I have tried the easy way of going to Synaptic and installing the drivers that way - the fglrx package (not the fglrx-updates package). If memory serves, I think that boots me into low-graphics mode. So, fresh install of Ubuntu and tried again. I have done everything a couple of times from this site (http://wiki.cchtml.com/index.php/Ubuntu_Precise_Installation_Guide), following every instruction to a T. That gets me something, such as a lowered fan speed and a much cooler computer, but I also lose most of my resolution, and Displays says it's the best resolution I can get. I also have a very screwy Firefox. Using this method I can see AMD Catalyst Control Center in my Dash (two of them really, one administrator and one not), but when I try to open it, it says no AMD driver detected. So again, Ubuntu reinstall. I have tried the GUI method with the legacy driver I got from AMD's site. It runs through smoothly, and at the very end, after I exit the installer, it gives me an error. I have also tried various other methods using the terminal, as well as various other drivers (the one from AMD's site and the one suggested in the above link for my graphics card), both to no avail. When I try the method in the link in number 2, I get the super low res and screwy Firefox. I type in fglrxinfo and get a BadRequest error. I have yet to type in fglrxinfo and get anything like what I am supposed to. UPDATE: I am now reinstalling Ubuntu 12.04. I tried the above mentioned link - thank you very much! - just to see, on the previously failed driver attempt, by following the purge commands. And to no avail: when typing fglrxinfo I still get the BadRequest error. I will update again after a try with a true fresh install. Thanks again!! UPDATE: Alright everyone, still no go. I have done everything word for word in the provided tutorial. I have rebooted my computer, again to a messed-up resolution, and this is what I get when typing fglrxinfo: $ fglrxinfo X Error of failed request: BadRequest (invalid request code or no such operation) Major opcode of failed request: 153 (GLX) Minor opcode of failed request: 19 (X_GLXQueryServerString) Serial number of failed request: 12 Current serial number in output stream: 12 I would like to add that when installing this file: fglrx_8.970-0ubuntu1_amd64.deb I got this: Building initial module for 3.8.0-29-generic Error! Bad return status for module build on kernel: 3.8.0-29-generic (x86_64) Consult /var/lib/dkms/fglrx/8.970/build/make.log for more information. update-initramfs: deferring update (trigger activated) Processing triggers for ureadahead ... Processing triggers for bamfdaemon ... Rebuilding /usr/share/applications/bamf.index... Processing triggers for initramfs-tools ... update-initramfs: Generating /boot/initrd.img-3.8.0-29-generic Processing triggers for libc-bin ... ldconfig deferred processing now taking place Any ideas? Anyone? I can't for the life of me figure out what I am doing wrong.

    Read the article

  • Do you write unit tests all the time in TDD?

    - by mcaaltuntas
    I have been designing and developing code in TDD style for a long time. What disturbs me about TDD is writing tests for code that does not contain any business logic or interesting behaviour. I know TDD is a design activity more than testing, but sometimes I feel it's useless to write tests in these scenarios. For example, I have a simple scenario like "When the user clicks the check button, it should check the file's validity". For this scenario I usually start writing tests for the presenter/controller class like the one below. @Test public void when_user_clicks_check_it_should_check_selected_file_validity(){ MediaService service = mock(MediaService.class); View view = mock(View.class); when(view.getSelectedFile()).thenReturn("c:\\Dir\\file.avi"); MediaController controller = new MediaController(service,view); controller.check(); verify(service).check("c:\\Dir\\file.avi"); } As you can see there is no design decision or interesting code to verify behaviour. I am testing that values from the view are passed to MediaService. I usually write, but don't like, these kinds of tests. What do you do in these situations? Do you write tests all the time? UPDATE: I have changed the test name and code after complaints. Some users said that you should write tests for trivial cases like this so that in the future someone might add interesting behaviour. But what about "Code for today, design for tomorrow."? If someone, including myself, adds more interesting code in the future, the test can be created for it then. Why should I do it now for the trivial cases?

    Read the article

  • Why are some UDP packets getting blocked?

    - by Tom
    In our organization, we have two test machines running Windows XP. While attempting to test a roll-my-own UDP message server, I found that both could receive small messages (under 2k) just fine. However, when I test sending large packets to both of these machines, one receives them fine, while the other can't receive them at all. Both machines have SP3 and both have their Windows Firewall shut off, but one still isn't working. Can anyone tell me where to look for anything that might be blocking or limiting the packet size on a Windows Machine? Thanks.

    Read the article

  • NetDiag + TCP Blocking?

    - by CrazyNick
    We are facing an issue with the SharePoint 2007 timer jobs every day at a specific time, so we decided to track the TCP blocking information during those hours using the NetDiag tool. We are not able to find the required information using "netdiag /test:ipsec". What command can be used to pull the TCP blocking information, and how do we configure it? Also, when I run the command "netdiag /test:ipsec /debug" it returns "IP Security test . . . . . . . . . : Skipped". What does that mean?

    Read the article

  • Why is my wireless so slow compared to my wired download speed?

    - by Shawn
    I just used speedtest.net (using Firefox) to compare my wired connection speed with my wireless connection speed. With my current contract (with Videotron), I'm supposed to get Download speed: 8Mbps Upload speed: 1Mbps Here are the results of the speedtest.net test: Wired Ping: 14ms Download speed: 8.41Mbps Upload speed: 1.04Mbps Wireless Ping: 16ms Download speed: 0.18Mbps Upload speed: 0.98Mbps The difference in download speeds seems staggering to me, since I did the test 1 meter away from my router. Any clue as to why my wireless download speed is so low compared to my wired download speed? I'm using Ubuntu 11.04 on an Acer Aspire 5536-5519. Oh, and it might be worth mentioning that my girlfriend has no trouble at all with her wireless connection. No slowness at all. (She uses Firefox on Windows 7 on a Dell.) Here are the results for the same test on her system: Ping: 22ms Download speed: 8.44Mbps Upload speed: 1.02Mbps

    Read the article

  • Prepare For Oracle Certification Exams With Confidence

    - by Brandye Barrington
    Empower yourself to put your best foot forward on exam day! Oracle Certification Exam Candidates, test with confidence using preparation tools created by Oracle and Oracle's only Authorized Practice Test Provider, Kaplan SelfTest. Oracle wants to help protect your investment of time and money by offering tools to help you be as prepared as possible for your certification exam as well as your future job role. Use these valuable tools to get the most out of your exam preparation: Online Exam Preparation Seminars, Online Practice Tests and the new free Online Demos from Kaplan SelfTest. FREE ONLINE DEMOS Choose from 1Z0-851 Java 6 Programmer Certified Professional or 1Z0-047 Oracle Database SQL Expert. Get a feel for the type and difficulty of questions on the Oracle Certification exams and determine if you are ready for the exam or if you need more preparation. This is a powerful tool that will help you plan your preparation and make the most of your investment. Access Free Online Demos Now ONLINE EXAM PREPARATION SEMINARS These one-day self-paced streaming video seminars are 100% focused on exam preparation. The streaming video format lets you fast forward, rewind, and replay at your own pace so that you can identify and close any knowledge gaps before taking the exam. The Exam Prep Seminar structures your studying - so you don't have to. Access Online Exam Preparation Seminars ONLINE PRACTICE TESTS Test your knowledge with Kaplan SelfTest Practice Exams. These practice tests are one of the most effective ways to prepare for your Oracle Certification exam by helping you self-assess your knowledge using realistic exam simulations. You can purchase practice exams from Oracle with 30-day or 12-month access. Access Online Practice Tests Approach exam day with confidence using the tools above.

    Read the article

  • Sending files through ssh

    - by Frion3L
    I need to send files to a server using ssh. I have never used ssh, so this is really frustrating for me. Note that the client (me) is running Windows and the server is running Ubuntu. I connected to the server using ssh2 ip, and then logged in with an account I have. Now I would like to send my files to a folder on the server, so I moved to the folder and used this command: scp test.txt user_name@host_direction server_folder_destination And it always returns that it can't 'stat' test.txt, the file doesn't exist, and so on. I'm assuming ssh2 can't see the file in my computer's root (C:), so I tried to be more specific and used C:\test.txt, but the same error appears. I don't know what is happening. Any hints please? Thanks
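
    A sketch of the usual scp invocation, for comparison only: it is run from the Windows client (not from inside the ssh session on the server), and the remote side is written as user@host:/path with a colon. The host name and remote folder below are placeholders, and this assumes an scp client (for example the scp2 shipped with the same SSH package, or pscp) is on the Windows PATH.

      REM run in a Windows command prompt on the client machine
      scp C:\test.txt user_name@server.example.com:/home/user_name/server_folder_destination/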

    Read the article

  • Why doesn't this batch file work for me?

    - by asdfg
    The following batch file is not working. @echo off python -c "print('echo text')" %TEMP%\test.bat call %TEMP%\test.bat Can anyone help me with this? Edit: I needed Unix eval functionality in Windows, but I could not find a direct way to do it, so I redirected the eval string to a temporary batch file and executed it. The temporary batch file was successfully created, but calling it did not work in the above case. I noticed that no command after the test.bat creation worked.
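
    A minimal sketch of what the script presumably intended, with one assumption: as pasted, the Python output is never redirected into %TEMP%\test.bat (the path is just handed to Python as an extra argument), so the ">" below is the assumed missing piece rather than a confirmed fix.

      @echo off
      python -c "print('echo text')" > %TEMP%\test.bat
      call %TEMP%\test.bat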

    Read the article

  • virtual hosts on lighttpd can't load

    - by Jake
    That's what I did: added the following code to lighttpd.conf $HTTP["host"] =~ "(^|\.)test\.com$" { server.document-root = "/home/test" } created /home/test, and restarted lighttpd, but it doesn't load anything. Google Chrome error: No data received Unable to load the webpage because the server sent no data. Here are some suggestions: Reload this webpage later. Error 324 (net::ERR_EMPTY_RESPONSE): The server closed the connection without sending any data. Firefox: The connection was reset The connection to the server was reset while the page was loading. The site could be temporarily unavailable or too busy. Try again in a few moments. If you are unable to load any pages, check your computer's network connection. If your computer or network is protected by a firewall or proxy, make sure that Firefox is permitted to access the Web. Can you please explain how I can fix this? Thanks!
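
    A sketch of two generic sanity checks for this kind of setup, assuming a stock Debian/Ubuntu-style layout (the config path and the www-data user are assumptions, not known parts of this server):

      # check that the configuration parses cleanly after the edit
      lighttpd -t -f /etc/lighttpd/lighttpd.conf
      # check that the lighttpd user can actually read the new docroot
      sudo -u www-data ls -l /home/test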

    Read the article

  • SSH automatic logon works for one user but not the other

    - by tinmaru
    I want to enable automatic SSH login using the .ssh/config file for my git user. Here is my .ssh/config file: Host test HostName myserver.net User test IdentityFile ~/.ssh/id_rsa Host git HostName myserver.net User git IdentityFile ~/.ssh/id_rsa It works for my test user but not for my git user, so my global SSH configuration is correct. The configurations are exactly the same as far as I know. It used to work with the git user, but I can't tell what change has broken the automatic logon. When I type: ssh -v git I get the following log: ... debug1: Authentications that can continue: publickey,password debug1: Next authentication method: publickey Offering RSA public key: /Users/mylocalusername/.ssh/id_rsa debug1: Authentications that can continue: publickey,password debug1: Next authentication method: password [email protected]'s password: _ Does anyone know what a possible difference could be?
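
    Since the key is offered but the server falls back to password authentication, here is a sketch of a generic server-side check. It assumes the server account is literally named git and that its key should live in ~git/.ssh/authorized_keys; this is a common first thing to look at, not a confirmed diagnosis.

      # on myserver.net
      sudo ls -ld ~git/.ssh ~git/.ssh/authorized_keys
      sudo chown -R git:git ~git/.ssh
      sudo chmod 700 ~git/.ssh
      sudo chmod 600 ~git/.ssh/authorized_keys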

    Read the article

  • Javascript Module pattern with DOM ready

    - by dego89
    I am writing a JS Module pattern to test out code and help me understand the pattern, using a JS Fiddle. What I can't figure out is why my "private methods" on line 25 and 26, when referenced via DOM ready, have a value of undefined. JSFiddle Code Sample: var obj = { key: "value" }; var Module = (function () { var innerVar = "5"; console.log("obj var in Module:"); console.log(obj); function privateFunction() { console.log("privateFunction() called."); innerFunction(); function innerFunction() { console.log("inner function of (private function) called."); } } function _numTwo() { console.log("_numTwo() function called."); } return { test: privateFunction, numTwo: _numTwo } }(obj)); $(document).ready(function () { console.log("$ Dom Ready"); console.log("Module in Dom Ready: "); console.log(Module.test()); });
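
    One likely explanation, sketched below rather than verified against the actual fiddle: console.log(Module.test()) logs the return value of the call, and privateFunction returns nothing, so undefined is printed even though the method itself runs fine. Giving it a return value makes that visible (the returned string is arbitrary).

      function privateFunction() {
          console.log("privateFunction() called.");
          innerFunction();
          function innerFunction() {
              console.log("inner function of (private function) called.");
          }
          return "privateFunction result";   // now Module.test() no longer logs undefined
      }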

    Read the article

  • How to automatically mount a folder and change ownership from root in virtualbox

    - by Fiztban
    It is my first time using VirtualBox and Ubuntu (14.04); I am on a Windows 7 host OS. I am trying to mount a shared folder that has files I need to access both in the VirtualBox guest and on the Windows OS. I have successfully mounted them using the vboxsf module from the installed Guest Additions. To mount I used the command sudo mount -t vboxsf <dir name in vbox> <directory in linux>. For example I used sudo mount -t vboxsf Test /home/user/Test I found several ways of mounting the directories automatically upon startup, for example the /etc/rc.local method (here), where you modify said file by appending the command to it (without sudo), or the fstab method (here). I prefer the rc.local method personally. Once mounted it has permissions dr-xr-xr-x; however, the directory is owned by root, and chown user /home/user/Test has no effect. This means I cannot make or change files in it as a normal user. In VirtualBox the directory to be shared is not set as read-only. Is there a way to automatically mount the shared folder and assign ownership to my non-root user?
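
    A sketch of the approach usually suggested for this: ownership of a vboxsf mount is fixed at mount time (chown afterwards has no effect), so the uid/gid (and optionally umask) are passed as mount options instead. The uid/gid value of 1000 is an assumption about the desktop user; check the real values with id.

      # one-off mount owned by the regular user
      sudo mount -t vboxsf -o uid=1000,gid=1000,umask=022 Test /home/user/Test
      # equivalent line for /etc/fstab (or keep the mount command in /etc/rc.local):
      # Test  /home/user/Test  vboxsf  uid=1000,gid=1000,umask=022  0  0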

    Read the article

  • Script to run chown on all folders, setting the owner to the folder name minus the trailing /

    - by Shikoki
    Some numpty ran chown -R username. in the /home folder on our webserver, thinking he was in the desired folder. Needless to say, the server is throwing a lot of wobblies. We have over 200 websites and I don't want to chown them all individually, so I'm trying to make a script that will change the owner of each folder to the folder name, without the trailing /. This is all I have so far; once I can remove the / it will be fine, but I'd also like to check whether the name contains a . in it, and if it doesn't, run the command, otherwise move on to the next one. #!/bin/bash for f in * do test=$f; #manipulate the test variable chown -R $test $f done Any help would be great! Thanks in advance!
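
    A minimal sketch of that loop with the two missing pieces filled in: ${f%/} strips a trailing slash, and a simple pattern test skips names containing a dot. It assumes it is run from inside /home and that every remaining directory name is also a valid user name.

      #!/bin/bash
      for f in */; do
          owner="${f%/}"                      # folder name without the trailing /
          [[ "$owner" == *.* ]] && continue   # skip names containing a dot
          chown -R "$owner" "$owner"
      done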

    Read the article

  • Kill program after it outputs a given line, from a shell script

    - by Paul
    Background: I am writing a test script for a piece of computational biology software. The software I am testing can take days or even weeks to run, so it has recovery functionality built in for the case of system crashes or power failures. I am trying to figure out how to test the recovery system. Specifically, I can't figure out a way to "crash" the program in a controlled manner. I was thinking of somehow timing a SIGKILL instruction to run after some amount of time. This is probably not ideal, as the test case isn't guaranteed to run at the same speed every time (it runs in a shared environment), so comparing the logs to the desired output would be difficult. This software DOES print a line for each section of analysis it completes. Question: I was wondering if there is a good/elegant way (in a shell script) to capture output from a program and then kill the program when a given line (or a given number of lines) is output by the program?
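
    A sketch of one way to do this with plain bash and GNU coreutils: run the program in the background, follow its log, and kill it as soon as a chosen marker line appears. The program name and marker text are placeholders for whatever the real software prints per completed section.

      #!/bin/bash
      ./analysis_program > run.log 2>&1 &
      pid=$!
      tail --pid="$pid" -n +1 -f run.log | while IFS= read -r line; do
          if [[ "$line" == *"section 3 complete"* ]]; then
              kill "$pid"        # or kill -9 for a harsher "crash"
              break
          fi
      done
      wait "$pid" 2>/dev/null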

    Read the article

  • Is it OK to introduce methods that are used only during unit tests?

    - by Mchl
    Recently I was TDDing a factory method. The method was to create either a plain object, or an object wrapped in a decorator. The decorated object could be of one of several types, all extending StrategyClass. In my test I wanted to check whether the class of the returned object is as expected. That's easy when a plain object is returned, but what to do when it's wrapped within a decorator? I code in PHP, so I could use ext/Reflection to find out the class of the wrapped object, but that seemed to me to be overcomplicating things, and somewhat against the rules of TDD. Instead I decided to introduce getClassName(), which would return the object's class name when called from StrategyClass. When called from the decorator, however, it would return the value returned by the same method in the decorated object. Some code to make it more clear: interface StrategyInterface { public function getClassName(); } abstract class StrategyClass implements StrategyInterface { public function getClassName() { return \get_class($this); } } abstract class StrategyDecorator implements StrategyInterface { private $decorated; public function __construct(StrategyClass $decorated) { $this->decorated = $decorated; } public function getClassName() { return $this->decorated->getClassName(); } } And a PHPUnit test: /** * @dataProvider providerForTestGetStrategy * @param array $arguments * @param string $expected */ public function testGetStrategy($arguments, $expected) { $this->assertEquals( __NAMESPACE__.'\\'.$expected, $this->object->getStrategy($arguments)->getClassName() ); } //below there's another test to check if proper decorator is being used My point here is: is it OK to introduce such methods, that have no other use than to make unit tests easier? Somehow it doesn't feel right to me.
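
    For contrast, a sketch of the same test written without the extra getClassName() method, under one stated assumption: the decorator would expose the wrapped object through an accessor such as getDecorated() (hypothetical here), and PHPUnit's assertInstanceOf() checks the type directly.

      /**
       * @dataProvider providerForTestGetStrategy
       */
      public function testGetStrategy($arguments, $expected)
      {
          $strategy = $this->object->getStrategy($arguments);
          // getDecorated() is a hypothetical accessor on StrategyDecorator
          if ($strategy instanceof StrategyDecorator) {
              $strategy = $strategy->getDecorated();
          }
          $this->assertInstanceOf(__NAMESPACE__.'\\'.$expected, $strategy);
      }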

    Read the article

  • Is there a better way to organize my module tests that avoids an explosion of new source files?

    - by luser droog
    I've got a neat (so I thought) way of having each of my modules produce a unit-test executable if compiled with the -DTESTMODULE flag. This flag guards a main() function that can access all static data and functions in the module, without #including a C file. From the README: -- Modules -- The various modules were written and tested separately before being coupled together to achieve the necessary basic functionality. Each module retains its unit-test, its main() function, guarded by #ifdef TESTMODULE. `make test` will compile and execute all the unit tests, producing copious output, but importantly exiting with an appropriate success or failure code, so the `make test` command will fail if any of the tests fail. Module TOC __________ test obj src header structures CONSTANTS ---- --- --- --- -------------------- m m.o m.c m.h mfile mtab TABSZ s s.o s.c s.h stack STACKSEGSZ v v.o v.c v.h saverec_ f.o f.c f.h file ob ob.o ob.c ob.h object ar ar.o ar.c ar.h array st st.o st.c st.h string di di.o di.c di.h dichead dictionary nm nm.o nm.c nm.h name gc gc.o gc.c gc.h garbage collector itp itp.c itp.h context osunix.o osunix.c osunix.h unix-dependent functions It's compiled by a tricky bit of makefile: m:m.c ob.h ob.o err.o $(CORE) itp.o $(OP) cc $(CFLAGS) -DTESTMODULE $(LDLIBS) -o $@ $< err.o ob.o s.o ar.o st.o v.o di.o gc.o nm.o itp.o $(OP) f.o where the module is compiled with its own C file plus every other object file except itself. But it's creating difficulties for the kindly programmer who offered to write the Autotools files for me. So the obvious way to make it "less weird" would be to bust out all the main functions into separate source files. But, but ... Do I gotta?
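
    For readers unfamiliar with the pattern, a sketch of what one of these guarded modules looks like (the helper function is illustrative; the real modules exercise their own static data and functions):

      /* m.c -- ordinary module code, plus a self-test main() */
      #include <stdio.h>
      #include "m.h"

      static int helper(int x) { return x + 1; }   /* static, but visible to the test below */

      #ifdef TESTMODULE
      int main(void) {
          if (helper(1) != 2) {
              fprintf(stderr, "m: helper test failed\n");
              return 1;            /* non-zero exit makes `make test` fail */
          }
          puts("m: all tests passed");
          return 0;
      }
      #endif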

    Read the article

  • How to get Apache to follow symlink instead of downloading it?

    - by user792445
    I am just using the standard Apache config file, which mentions that it follows symlinks, but when I hit the URL http://localhost/test it downloads the symlink file instead of following it. What config do I need to change to get Apache to follow the symlink instead of downloading it? This is an ls on the directory: $ ls -al total 10 drwx------+ 1 SYSTEM SYSTEM 0 Oct 20 10:55 . drwx------+ 1 SYSTEM SYSTEM 0 Aug 26 12:27 .. -rw-r--r--+ 1 me None 47 Oct 20 10:14 index.html lrwxrwxrwx 1 me None 29 Oct 19 17:10 test -> /home/me/projects/test This is in my Apache config file: <Directory "D:/Program Files (x86)/Apache Software Foundation/Apache2.2/htdocs"> Options Indexes FollowSymLinks AllowOverride None Order allow,deny Allow from all </Directory>
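
    One possible reading, given the Cygwin-style listing next to a native Windows Apache: a symlink created by Cygwin is stored as a special file that a native Windows build of Apache cannot follow, so Apache serves the link file itself and the browser downloads it. A sketch of a workaround that sidesteps the symlink with mod_alias; the Windows path to the target directory is an assumption.

      Alias /test "D:/cygwin/home/me/projects/test"
      <Directory "D:/cygwin/home/me/projects/test">
          Options Indexes FollowSymLinks
          Order allow,deny
          Allow from all
      </Directory>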

    Read the article

  • local wordpress installation not accessible from the outside world

    - by hello
    I have a working installation of WordPress located in /var/www/html/wordpress. It is accessible on my local network at [local-machine-ip]/wordpress/. There is also a test page located at /var/www/html/test.html; it is also accessible on my local network at [local-machine-ip]. I would like the WordPress website to be accessible from the outside world. I know that my ISP blocks incoming requests on port 80, so I set my router to redirect requests from port 8080 to 80. This feature appears to be working correctly, since I can access the test.html page using my public IP address as follows: [public-ip]:8080 However, I cannot access [public-ip]:8080/wordpress Here is my Apache config: <VirtualHost *:80> ServerAdmin webmaster@localhost DocumentRoot /var/www/html ServerName [my.domain.com] <Directory /var/www/html/> Options FollowSymLinks Indexes MultiViews AllowOverride All Order allow,deny allow from all </Directory> ErrorLog ${APACHE_LOG_DIR}/error.log CustomLog ${APACHE_LOG_DIR}/access.log combined </VirtualHost> Thanks!
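
    Since the static page works but WordPress does not, one theory worth testing (a sketch, not a confirmed diagnosis): WordPress redirects every request to the URL stored in its siteurl/home settings, and if those point at the LAN address without the :8080 port, outside visitors get bounced to an address they cannot reach. These wp-config.php overrides let you try a public URL without touching the database; the hostname is a placeholder.

      // in wp-config.php, above the "That's all, stop editing!" line
      define('WP_HOME',    'http://your-public-hostname:8080/wordpress');
      define('WP_SITEURL', 'http://your-public-hostname:8080/wordpress');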

    Read the article
