Search Results

Search found 3938 results on 158 pages for 'goal'.

Page 43/158

  • Requiring SSH-key Login Via PAM From Specific IP Ranges

    - by Sean M
    I need to be able to access my server (Ubuntu 8.04 LTS) from remote sites, but I'd like to worry a bit less about password complexity. Thus, I'd like to require that SSH keys be used for login instead of name/password. However, I still have a lot to learn about security, and having already badly broken a test box when I was trying to set this up, I'm acutely aware of the chance of screwing myself while trying to accomplish this. So I have a second goal: I'd like to require that certain IP ranges (e.g. 10.0.0.0/8) may log in with name/password, but everyone else must use an SSH key to log in. How can I satisfy both of these goals? There already exists a very similar question here, but I can't quite figure out how to get to what I want from that information. Current tactic: reading through the PAM documentation (pam_access looks promising) and looking at /etc/ssh/sshd_config.
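
    A minimal sshd_config sketch of one common approach, with keys required everywhere except the trusted range (assumes OpenSSH 4.4 or newer for Match support; Ubuntu 8.04's OpenSSH 4.7 qualifies, and this is a sketch, not a tested recipe for this exact box):

        # /etc/ssh/sshd_config -- require keys by default
        PasswordAuthentication no
        ChallengeResponseAuthentication no

        # relax the rule only for the trusted range
        Match Address 10.0.0.0/8
            PasswordAuthentication yes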

    Read the article

  • QoS / PBR Routing Questions

    - by Bernard
    I have a 50Mbs Satellite link and a 10Mbs Microwave link supplying a very remote location. Behind these links, I have a 6,400 seat network - with about 3,000 signed in at any one time. My goal is to send all of the Voip traffic (Google Chat, Magic Jack, Skype, Speakeasy, Vonage, Vonage PC, Yahoo) through the microwave link which has 100ms latency. The rest of the traffic can utilize any remaining bandwidth of the microwave link with excess being diverted to the higher latency (600ms) satellite connection. The problem I've had so far is that most automatic routing configurations weigh the bandwidth heavily for preference - and I'm only wanting latency considered. Additionally, I don't know if this can even be handled with the routing hardware I have at my disposal (Cisco 3640, 3745, & 3845). Any recommendations (or really good starting points) would be greatly appreciated.
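
    A heavily hedged IOS policy-based-routing sketch of the general shape (the interface, next-hop address and the ACL are placeholders; reliably classifying Skype/Vonage/Google Chat traffic by port is its own problem and is not solved here):

        ! placeholder ACL: SIP signalling plus a typical RTP range only
        access-list 150 permit udp any any eq 5060
        access-list 150 permit tcp any any eq 5060
        access-list 150 permit udp any any range 16384 32767
        !
        route-map VOIP-VIA-MICROWAVE permit 10
         match ip address 150
         set ip next-hop 192.0.2.1
        !
        interface FastEthernet0/0
         ip policy route-map VOIP-VIA-MICROWAVE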

    Read the article

  • CentOS 5.8 server, installed Web Server vs yum install httpd

    - by Shiro
    I would like to know the difference between the built-in CentOS "Web Server" option and installing manually with yum install httpd. I am new to CentOS. I just installed my CentOS 5.8 server with "Web Server" checked. The LAMP installation guides I found through Google all start with yum install httpd. I checked inside /etc/init.d/ and httpd is already there, yet when I try to run yum install httpd http-devel it shows them as not yet installed. What will happen if I install the yum version? What should I do? What is the best practice? Should I remove the Web Server installed by default with CentOS? My goal is to install LAMP (PHP v5.2.17).
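
    A few commands that may help establish what is actually on the box before deciding (standard RPM/yum/sysvinit usage, nothing CentOS-5-specific assumed):

        rpm -q httpd              # is the Apache package installed, and which version?
        yum list installed httpd  # the same question, asked of yum
        service httpd status      # is the /etc/init.d/httpd script pointing at a running daemon?
        httpd -v                  # version of the binary that would actually serve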

    Read the article

  • Prebuilt ActiveMQ Server Based off ZeroMQ

    - by VxJasonxV
    Are there any distributions of fully built message brokers that are based on ZeroMQ? I had thought that downloading/installing ZeroMQ would give me such a thing, not just a handful of pieces for rolling my own. Currently we use ActiveMQ, but it is a miserable pain to configure, so I'd rather slim down the profile; unfortunately, I learned that ZeroMQ is not a one-step solution to achieving that goal. Alternatives are OK, but I'd prefer something less verbose to configure than ActiveMQ's ludicrous amount of Java configuration: broker config + Java logger config + many other intricacies that I don't wish to deal with. (Read: preferably not Java-based in the first place.) I'm looking for the reliable, basic functionality described by JMS brokers: topics, queues, message persistence, etc.

    Read the article

  • Enable basic auth sitewide and disabling it for subpages?

    - by piquadrat
    I have a relatively straightforward config:

        upstream appserver-1 {
            server unix:/var/www/example.com/app/tmp/gunicorn.sock fail_timeout=0;
        }

        server {
            listen 80;
            server_name example.com;

            location / {
                proxy_pass http://appserver-1;
                proxy_redirect off;
                proxy_set_header Host $host;
                proxy_set_header X-Real-IP $remote_addr;
                proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;

                auth_basic "Restricted";
                auth_basic_user_file /path/to/htpasswd;
            }

            location /api/ {
                auth_basic off;
            }
        }

    The goal is to use basic auth on the whole website, except on the /api/ subtree. Basic auth itself behaves as intended, but other directives such as proxy_pass do not take effect on /api/. Is it possible to disable just basic auth there while retaining the other directives, without copying and pasting everything?
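
    One way this is commonly handled, sketched against the config above: nginx does not inherit proxy_* directives from a sibling location, so the /api/ block has to carry its own copies (or pull them in from an include file), with only auth_basic switched off:

        location /api/ {
            auth_basic off;

            # repeated (or include'd) because they are not inherited from "location /"
            proxy_pass http://appserver-1;
            proxy_redirect off;
            proxy_set_header Host $host;
            proxy_set_header X-Real-IP $remote_addr;
            proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        }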

    Read the article

  • Proper application shutdown before windows xp auto shutdown

    - by vashman
    I frequently leave the computer on playing a movie or downloading a file while I go to bed. I use the 'shutdown computer when finished' feature of KMPlayer, GetRight, uTorrent, or whatever program I am using. This method effectively shuts down the computer, but the problem is that some applications seem to be exited forcefully by this kind of shutdown; this is clearly reflected in Winamp not saving the current playlist and config, Messenger not saving the chat logs, and so on. My goal is to have all applications close properly when the automatic/scheduled shutdown is triggered. I am looking for some Windows shutdown mode or setting that closes applications the way a user would. I am not expecting it to auto-click on save dialog prompts; if that is needed, I will do it before leaving the computer on for automatic shutdown.

    Read the article

  • Assigning multiple IPv6 addresses on a Server

    - by andrewk
    Let me explain my intent. My host provides hundreds of IPv6 addresses for free, but charges for an IPv4 address. I have several sites under one server and I was wondering if I can give each site/domain its own IPv6 address. Is that even possible? If so, how? I've read quite a bit about IPv6 but I do not understand it as clearly as I'd like. My main goal is for each domain/site to have its own unique IP, so someone can't do a reverse IP lookup and see what sites I have on that server. Thanks in advance for your patience.
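
    A rough sketch of the usual mechanics (the addresses below use the 2001:db8:: documentation prefix and the hostname is made up; substitute addresses from the range the host actually delegated):

        # add an extra IPv6 address to the interface
        ip -6 addr add 2001:db8:1::10/64 dev eth0

        # then bind one site to that address, e.g. in an Apache virtual host
        <VirtualHost [2001:db8:1::10]:80>
            ServerName site-one.example.com
        </VirtualHost>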

    Read the article

  • Best practice for authenticating DMZ against AD in LAN

    - by Sergei
    We have a few customer-facing servers in the DMZ that also have user accounts; all accounts are in a shadow password file. I am trying to consolidate user logons and am thinking about letting LAN users authenticate against Active Directory. The services needing authentication are Apache, ProFTPD and SSH. After consulting the security team I have set up an authentication DMZ that has an LDAPS proxy which in turn contacts another LDAPS proxy (proxy2) in the LAN, and this one passes authentication info via LDAP (as an LDAP bind) to the AD controller. The second LDAP proxy is only needed because the AD server refuses to speak TLS with our secure LDAP implementation. This works for Apache using the appropriate module. At a later stage I may try to move customer accounts from the servers to the LDAP proxy so they are not scattered around servers. For SSH, I joined proxy2 to the Windows domain so users can log on using their Windows credentials. Then I created SSH keys and copied them to the DMZ servers using ssh-copy-id, to enable passwordless logon once users are authenticated. Is this a good way to implement this kind of SSO? Did I miss any security issues here, or is there maybe a better way of achieving my goal?

    Read the article

  • Security Resources

    - by dr.pooter
    What types of sites do you visit on a regular basis to stay current on information security issues? Some examples from my list include: http://isc.sans.org/ http://www.kaspersky.com/viruswatch3 http://www.schneier.com/blog/ http://blog.fireeye.com/research/ as well as following the security heavyweights on Twitter. I'm curious to hear what resources you recommend for daily monitoring. Anything specific to particular operating systems or other software? Are mailing lists still considered valuable? My goal would be to trim the cruft of all the things I'm currently subscribed to and focus on the essentials.

    Read the article

  • win2008 r2 IIS7.5 - setting up a custom user for an application pool, and trust issues

    - by Ken Egozi
    Scenario: a blank Win2008 R2 install. The goal was to have a couple of sites running with isolated app pools and dedicated users.

    - Created a new folder for a new website, c:\web\siteA\wwwroot, with the app (ASP.NET) deployed there in the /bin folder
    - Created a user named "appuser" and added it to the IIS_USERS group
    - Gave the website folder read and execute permissions for IIS_USERS and the appuser
    - Created the IIS site and set the app-pool identity to the appuser

    Now I'm getting a YSOD telling me that the trust level is too low - SecurityException: That assembly does not allow partially trusted callers. Adding <trust level="Full" /> to the web.config did not help. Changing the app-pool user to Administrator makes the site run. Setting the "anonymous user identity" to either IUSR or the app pool identity makes no difference. Any idea? Is there a "step by step" howto guide for setting up users for isolated app pools on IIS 7.5?

    Read the article

  • Software for monitoring internal software?

    - by Tyler Eaves
    Is there any good software for monitoring the health of a collection of related software? Requirements are as follows:

    - Web-based, deployable on standard Linux/BSD software
    - Configurable to support a variety of processes, scheduled at various intervals
    - Some sort of dashboard interface, for monitoring status, viewing errors, etc.

    As an example, suppose we have a daily export that's scheduled to run at 6 AM each morning. After the export completes, it would POST a status message saying it had completed, passing in some sort of application key to identify the export. If that status message hadn't come in by, say, 6:30 AM, an e-mail might be sent, the application should go red on the dashboard, and so on. Applications should also be able to post error/warning messages. Basically the goal is to be able to monitor all of our internal projects from one system, rather than through a multitude of e-mails, log files, etc. I suspect that I'll probably end up writing this from scratch, but I just thought I'd ask.
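
    As a sketch of the dead-man-switch idea described in the example above (all paths, times and addresses here are hypothetical):

        #!/bin/sh
        # run from cron at 06:30; the export job is assumed to touch this file on success
        STATUS=/var/run/jobs/daily-export.ok
        if [ -z "$(find "$STATUS" -mmin -60 2>/dev/null)" ]; then
            echo "daily export has not reported success" \
                | mail -s "ALERT: daily export missing" ops@example.com
        fi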

    Read the article

  • KVM Switch for multiple monitors possible?

    - by jasondavis
    I have been curious about this for YEARS! I have never used a KVM (keyboard, video, mouse) switch, which lets you use one keyboard, mouse and monitor to operate two or more computers, but I understand the concept and what they are for; I just never really had a huge need for one until now. Here is my issue, though. If possible, I need to have a KVM switch hooked up to 2 computers in my room. The catch: I have 2-3 monitors on the PC, so I am looking for a KVM switch that will support multiple monitors. So PC #1 can be using a mouse, keyboard and 2-3 monitors like normal; I then hit the switch and my mouse, keyboard and 2-3 monitors switch to PC #2. Has anyone ever heard of a KVM switch that supports multiple monitors like I described? Or how can I achieve this goal? Please help.

    Read the article

  • NIC interface names in /proc/interrupts

    - by Gallus
    When I look at /proc/interrupts:

        $ cat /proc/interrupts
                   CPU0         CPU1
        (...)
         12:          4            0   IO-APIC-edge    i8042
         14:        145     65310875   IO-APIC-edge    ide0
         50:          0            0   IO-APIC-level   uhci_hcd:usb5, Intel ICH7
         58:       5388      7983508   IO-APIC-level   libata
        169:  812427252   1236572641   IO-APIC-level   skge, eth1
        217:          6            0   IO-APIC-level   ehci_hcd:usb1, uhci_hcd:usb2
        225:          0            0   IO-APIC-level   uhci_hcd:usb3
        233:         60   3108720778   IO-APIC-level   uhci_hcd:usb4, skge

    I can see two skge entries and one eth1 entry. All of them are network cards. Because of the generic name "skge" (the name of the card's network driver) I can't easily recognize which NIC occupies which interrupt. How can I make Linux use more descriptive names in these entries? Or is there an alternative way to obtain this interrupt information instead of /proc/interrupts? My final goal is to manipulate the smp_affinity of the NICs.
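
    For the stated end goal, a short sketch of the usual route (IRQ number and CPU mask taken from the listing above; the /sys path works for typical single-queue PCI NICs on 2.6 kernels but is not guaranteed for every driver):

        # map the interface name to its IRQ without relying on the driver name
        cat /sys/class/net/eth1/device/irq

        # pin IRQ 169 to CPU1 only; the value is a hex CPU bitmask (1=CPU0, 2=CPU1, 3=both)
        echo 2 > /proc/irq/169/smp_affinity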

    Read the article

  • Activate Windows XP in VMWare?

    - by Zeno
    Running Windows XP in VMware, how should activation be handled? Last time I did this, I do not remember having to do any activation; now it is asking me to activate Windows XP. Am I supposed to buy and use a real WinXP key? And what is this "Windows XP Mode" I'm hearing about that you can enable in VMware to activate it? The availability of Windows XP Mode is supposedly equivalent to a free license for the Windows XP operating system. My host machine runs Win7. My goal is to get WinXP running in VMware the simplest way possible, but this activation is giving me issues (such as that WinXP doesn't seem to be sold anymore). I Googled around and found that VMware has an xpmode.enabled = "TRUE" setting which instantly activates Windows XP (with limits, like only once), but I do not know anything about it or how to use it.

    Read the article

  • Failover tmpfs mirroring. Am I doing it right?

    - by user45286
    My goal is to have a certain directory available as tmpfs. The directory will be modified during server uptime, and those modifications must be synced over rsync to a persistent (non-tmpfs) directory on the HDD. After a server boot, the latest version from the persistent directory must be copied back into tmpfs and the rsync syncing started again. I'm afraid that rsync will erase the persistent backup if the tmpfs directory happens to be empty. I'm doing it this way right now:

    - Create the tmpfs partition in /etc/fstab
    - In /etc/rc.local (pseudocode):
      - delete the "tmpfs rsync" cron job from /var/spool/cron/crontabs, if there is any
      - cp -r /path/to/non-tmpfs-backup /path/to/tmpfs/dir
      - append the "tmpfs rsync" cron job to /var/spool/cron/crontabs

    What do you think?
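
    A minimal sketch of the sync job with a guard against the feared empty-tmpfs case (paths are the placeholders from the question):

        #!/bin/sh
        # cron job: mirror the tmpfs dir into the persistent copy,
        # but refuse to run if the tmpfs side is empty so a bare mount
        # can never wipe the on-disk backup
        SRC=/path/to/tmpfs/dir/
        DST=/path/to/non-tmpfs-backup/
        if [ -n "$(ls -A "$SRC")" ]; then
            rsync -a --delete "$SRC" "$DST"
        fi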

    Read the article

  • FTP transfer hangs for random files

    - by hoffmandirt
    I've been stuck on this FTP issue for a while now. I have IIS 7 set up with an IIS 6 FTP server running on a Windows Server 2008 box. The problem I am running into is that I can't download certain files from the FTP server, even though I uploaded those files to the FTP server. The connection times out after 120 seconds. I have used Wireshark and checked the log files; the only message I see is the timeout message. The first thing that came to mind was permission issues; however, I have probably tried every combination of permissions that I can think of, with the end goal of making the permissions identical for the files that work and the files that do not. With the list of files I have now, I can download the zip, war, and msi files, but not the txt or sql files. It almost seems like a binary thing, but I've changed the transfer mode on my FTP client and also toggled the Active/Passive options.

    Read the article

  • Which Single Source Publishing tools and strategies are available?

    - by Another Registered User
    I'm about to write 1000 pages of documentation for a huge programming framework. The goal is to bring this documentation onto a web platform, so that online users can search through it and read it online. At the same time, the text has to be made available in PDF format for download. And at the same time, the whole thing needs to go into a printed book as well (print on demand; they want one giant PDF file with the whole book). The PDF files: the whole content is divided into several chapters, and every chapter will be available as a standalone PDF eBook. Finally, all chapters will be available in one huge printed book. Is LaTeX capable of something like that? Can it be used for single-source publishing? Or would I have to take a look at other technologies like DocBook, etc.?
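
    For the LaTeX part of the question specifically, a minimal sketch of how one source can drive both the full book and standalone chapter PDFs (chapter file names are made up; the HTML output would need an extra tool such as tex4ht on top of this):

        % master.tex -- build the whole book, or uncomment \includeonly for one chapter
        \documentclass{book}
        % \includeonly{chapter03}
        \begin{document}
        \include{chapter01}
        \include{chapter02}
        \include{chapter03}
        \end{document}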

    Read the article

  • Open Google Chrome Specific Profile From Command Line Mac

    - by gradedcatfood
    I have been trying to open Google Chrome from the command line, but with no luck! I have tried "How do I start Chrome using a specified user profile?". My goal is to open Google Chrome with a specific profile such as "profile 1", "profile 2", or "Default" from the command line (bash, to be specific) on my Mac. UPDATE 6/3/14: Got this to work, BUT it only works when opening Chrome for the first time: open -a Google\ Chrome --args --"profile-directory"="Profile 1" So how do you get --args to be accepted AFTER Google Chrome has already been launched?
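
    One thing worth trying (a sketch, not verified on every OS X version): open's -n flag launches a new instance even when the application is already running, so the profile argument is handed to a fresh process instead of being ignored:

        open -n -a "Google Chrome" --args --profile-directory="Profile 1"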

    Read the article

  • Understanding Zabbix Triggers

    - by Mediocre Gopher
    I have Zabbix set up with an item to monitor a log file on a Zabbix client:

        log["/var/log/program_name/client.log","ERROR:","UTF-8",100]

    and a trigger to determine when that log file gets more ERRORs:

        {Template_Linux:log["/var/log/program_name/client.log","ERROR:","UTF-8",100].change(0)}#0

    This trigger gets tripped the first time the log file gets ERRORs, but then that first trigger just sits around forever in Monitoring -> Triggers. My understanding was that the next time the server checks the value of log["/var/log/program_name/client.log","ERROR:","UTF-8",100] and sees that it hasn't changed, the trigger would go away. Obviously this isn't the case. Could someone explain why this first trigger isn't going away? Ultimately my goal is to receive an email whenever ERRORs are added to that log file, but I would like to understand how triggers work first.

    Read the article

  • Recover an HP recovery partition

    - by eric.chartier
    I have a (semi-)dead hard drive with an HP recovery partition on it. My goal is to:

    1. Buy a new hard drive
    2. Copy the recovery partition to an image (dd if=/dev/sdb1 of=~/recovery.bak)
    3. Make a new partition of 12000 MB with Windows 7
    4. Copy the recovery partition back to the new drive (dd if=~/recovery.bak of=/dev/sdb1)
    5. Press F11 when the laptop boots

    However, this doesn't work. Any idea why? Edit: I suspect F11 doesn't work because the laptop boots straight into Windows, since the Windows partition is the primary partition of the drive. Does anyone have experience dealing with stuff like this?

    Read the article

  • Postfix configuration problem

    - by dhanya
    Can anyone help me by giving your Postfix configuration file as a reference, so that I can find my mistakes? I'm working on SUSE Linux Enterprise Server. My goal is to set up a mail server on a campus network. Postfix shows that it is running, but no mail arrives in /var/spool/mail when I send mail using the mail command at the terminal. Here is my main.cf file; please help me find a solution:

        readme_directory = /usr/share/doc/packages/postfix-doc/README_FILES
        inet_protocols = all
        biff = no
        mail_spool_directory = /var/mail
        canonical_maps = hash:/etc/postfix/canonical
        virtual_alias_maps = hash:/etc/postfix/virtual
        virtual_alias_domains = hash:/etc/postfix/virtual
        relocated_maps = hash:/etc/postfix/relocated
        transport_maps = hash:/etc/postfix/transport
        sender_canonical_maps = hash:/etc/postfix/sender_canonical
        masquerade_exceptions = root
        masquerade_classes = envelope_sender, header_sender, header_recipient
        myhostname = cmail.cetmail
        delay_warning_time = 1h
        message_strip_characters = \0
        program_directory = /usr/lib/postfix
        inet_interfaces = all
        #inet_interfaces = 127.0.0.1
        masquerade_domains = cetmail
        mydestination = cmail.cetmail, localhost.cetmail, cetmail
        defer_transports =
        mynetworks_style = subnet
        disable_dns_lookups = no
        relayhost = postfix
        mailbox_command = cyrus
        mailbox_transport =
        strict_8bitmime = no
        disable_mime_output_conversion = no
        smtpd_sender_restrictions = hash:/etc/postfix/access
        smtpd_client_restrictions =
        smtpd_helo_required = no
        smtpd_helo_restrictions =
        strict_rfc821_envelopes = no
        smtpd_recipient_restrictions = permit_mynetworks,reject_unauth_destination
        smtp_sasl_auth_enable = no
        smtpd_sasl_auth_enable = no
        smtpd_use_tls = no
        smtp_use_tls = no
        alias_maps = hash:/etc/aliases
        mailbox_size_limit = 0
        message_size_limit = 10240000
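
    A few standard checks that usually narrow this kind of problem down (SUSE logs Postfix to /var/log/mail by default; adjust if your syslog config differs):

        postconf -n               # only the settings that differ from built-in defaults
        postqueue -p              # is the test message stuck in the queue?
        tail -n 50 /var/log/mail  # what does Postfix itself say happened to the message?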

    Read the article

  • Top ten security tips for non-technical users

    - by Justin
    I'm giving a presentation later this week to the staff at the company where I work. The goal of the presentation is to serve as a refresher/reminder of good practices that can help keep our network secure. The audience is made up of both programmers and non-technical staff, so the presentation is geared toward non-technical users. I want part of this presentation to be a top list of "tips". The list needs to be short (to encourage memory) and be specific and relevant to the user. I have the following five items so far:

    - Never open an attachment you didn't expect
    - Only download software from a trusted source, like download.com
    - Do not distribute passwords when requested via phone or email
    - Be wary of social engineering
    - Do not store sensitive data on an FTP server

    Some clarifications: this is for our work network; these need to be "best practices" tips for the end user, not IT policy; we have backups, OS patches, firewall, AV, etc., all centrally managed; and this is for a small business (fewer than 25 people). I have two questions: Do you suggest any additional items? Do you suggest any changes to existing items?

    Read the article

  • What's the simplest way to serve RhodeCode over HTTPS on windows?

    - by Keith Nicholas
    I have RhodeCode working over HTTP using the paster serve tool that it comes with... I'm struggling to find a "simple" solution to get this running over HTTPS. A lot of the discussion is about using Apache to do this on Unix; there is not a lot of info on how to do it on IIS. I was looking at paster serve and it seems to be able to serve over HTTPS, but I can't quite work out how to get this going. However, the real goal is just to serve RhodeCode over HTTPS in the simplest way possible (all self-contained would be brilliant).
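
    One possibility, sketched and unverified: the Paste HTTP server that paster serve uses is reported to accept an ssl_pem option pointing at a combined certificate+key PEM file, which would keep everything self-contained (the option name and the paths below are assumptions; check the Paste/RhodeCode docs before relying on this):

        [server:main]
        use = egg:Paste#http
        host = 0.0.0.0
        port = 443
        ssl_pem = c:\rhodecode\rhodecode.pem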

    Read the article

  • maven sonar problem

    - by senzacionale
    I want to use Sonar for analysis, but I can't get any data in localhost:9000.

        <?xml version="1.0" encoding="UTF-8"?>
        <project xmlns="http://maven.apache.org/POM/4.0.0"
                 xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
                 xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
          <modelVersion>4.0.0</modelVersion>
          <artifactId>KIS</artifactId>
          <groupId>KIS</groupId>
          <version>1.0</version>
          <build>
            <plugins>
              <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-antrun-plugin</artifactId>
                <version>1.4</version>
                <executions>
                  <execution>
                    <id>compile</id>
                    <phase>compile</phase>
                    <configuration>
                      <tasks>
                        <property name="compile_classpath" refid="maven.compile.classpath"/>
                        <property name="runtime_classpath" refid="maven.runtime.classpath"/>
                        <property name="test_classpath" refid="maven.test.classpath"/>
                        <property name="plugin_classpath" refid="maven.plugin.classpath"/>
                        <ant antfile="${basedir}/build.xml">
                          <target name="maven-compile"/>
                        </ant>
                      </tasks>
                    </configuration>
                    <goals>
                      <goal>run</goal>
                    </goals>
                  </execution>
                </executions>
              </plugin>
            </plugins>
          </build>
        </project>

    Output when running Sonar (the jar file is empty):

        [INFO] Executed tasks
        [INFO] [resources:testResources {execution: default-testResources}]
        [WARNING] Using platform encoding (Cp1250 actually) to copy filtered resources, i.e. build is platform dependent!
        [INFO] skip non existing resourceDirectory J:\ostalo_6i\KIS deploy\ANT\src\test\resources
        [INFO] [compiler:testCompile {execution: default-testCompile}]
        [INFO] No sources to compile
        [INFO] [surefire:test {execution: default-test}]
        [INFO] No tests to run.
        [INFO] [jar:jar {execution: default-jar}]
        [WARNING] JAR will be empty - no content was marked for inclusion!
        [INFO] Building jar: J:\ostalo_6i\KIS deploy\ANT\target\KIS-1.0.jar
        [INFO] [install:install {execution: default-install}]
        [INFO] Installing J:\ostalo_6i\KIS deploy\ANT\target\KIS-1.0.jar to C:\Documents and Settings\MitjaG\.m2\repository\KIS\KIS\1.0\KIS-1.0.jar
        [INFO] ------------------------------------------------------------------------
        [INFO] Building Unnamed - KIS:KIS:jar:1.0
        [INFO]   task-segment: [sonar:sonar] (aggregator-style)
        [INFO] ------------------------------------------------------------------------
        [INFO] [sonar:sonar {execution: default-cli}]
        [INFO] Sonar host: http://localhost:9000
        [INFO] Sonar version: 2.1.2
        [INFO] [sonar-core:internal {execution: default-internal}]
        [INFO] Database dialect class org.sonar.api.database.dialect.Oracle
        [INFO] ------------- Analyzing Unnamed - KIS:KIS:jar:1.0
        [INFO] Selected quality profile : KIS, language=java
        [INFO] Configure maven plugins...
        [INFO] Sensor SquidSensor...
        [INFO] Sensor SquidSensor done: 16 ms
        [INFO] Sensor JavaSourceImporter...
        [INFO] Sensor JavaSourceImporter done: 0 ms
        [INFO] Sensor AsynchronousMeasuresSensor...
        [INFO] Sensor AsynchronousMeasuresSensor done: 15 ms
        [INFO] Sensor SurefireSensor...
        [INFO] parsing J:\ostalo_6i\KIS deploy\ANT\target\surefire-reports
        [INFO] Sensor SurefireSensor done: 47 ms
        [INFO] Sensor ProfileSensor...
        [INFO] Sensor ProfileSensor done: 16 ms
        [INFO] Sensor ProjectLinksSensor...
        [INFO] Sensor ProjectLinksSensor done: 0 ms
        [INFO] Sensor VersionEventsSensor...
        [INFO] Sensor VersionEventsSensor done: 31 ms
        [INFO] Sensor CpdSensor...
        [INFO] Sensor CpdSensor done: 0 ms
        [INFO] Sensor Maven dependencies...
        [INFO] Sensor Maven dependencies done: 16 ms
        [INFO] Execute decorators...
        [INFO] ANALYSIS SUCCESSFUL, you can browse http://localhost:9000
        [INFO] Database optimization...
        [INFO] Database optimization done: 172 ms
        [INFO] ------------------------------------------------------------------------
        [INFO] BUILD SUCCESSFUL
        [INFO] ------------------------------------------------------------------------
        [INFO] Total time: 6 minutes 16 seconds
        [INFO] Finished at: Fri Jun 11 08:28:26 CEST 2010
        [INFO] Final Memory: 24M/43M
        [INFO] ------------------------------------------------------------------------

    Any idea why? I compile successfully with the maven-antrun-plugin (it is a Java project).

    Read the article

  • Maven doesn't see my <repository> in <dependencyManagement>

    - by Ondra Žižka
    To make Maven "deploy" to a directory, I use this: <distributionManagement> <downloadUrl>http://code.google.com/p/junitdiff/downloads/list</downloadUrl> <repository> <id>local-hack-repo</id> <name>LocalDir</name> <url>file://${project.basedir}/dist-maven</url> </repository> <snapshotRepository> <id>jboss-snapshots-repository</id> <name>JBoss Snapshots Repository</name> <!-- <url>https://repository.jboss.org/nexus/content/repositories/snapshots</url> --> <url>file://${project.basedir}/dist-maven</url> </snapshotRepository> </distributionManagement> This appears in the efffective pom. ... <distributionManagement> <repository> <id>local-hack-repo</id> <name>LocalDir</name> <url>file:///home/ondra/work/TOOLS/JUnitDiff/github/dist-maven</url> </repository> <snapshotRepository> <id>jboss-snapshots-repository</id> <name>JBoss Snapshots Repository</name> <url>file:///home/ondra/work/TOOLS/JUnitDiff/github/dist-maven</url> </snapshotRepository> <downloadUrl>http://code.google.com/p/junitdiff/downloads/list</downloadUrl> </distributionManagement> But still, Maven insists that it's not there: [INFO] [ERROR] Failed to execute goal org.apache.maven.plugins:maven-deploy-plugin:2.7:deploy (default-deploy) on project JUnitDiff: Deployment failed: repository element was not specified in the POM inside distributionManagement element or in -DaltDeploymentRepository=id::layout::url parameter -> [Help 1] [INFO] org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.apache.maven.plugins:maven-deploy-plugin:2.7:deploy (default-deploy) on project JUnitDiff: Deployment failed: repository element was not specified in the POM inside distributionManagement element or in -DaltDeploymentRepository=id::layout::url parameter [INFO] at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:217) [INFO] at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153) [INFO] at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145) [INFO] at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:84) [INFO] at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:59) [INFO] at org.apache.maven.lifecycle.internal.LifecycleStarter.singleThreadedBuild(LifecycleStarter.java:183) [INFO] at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:161) [INFO] at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:320) [INFO] at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:156) [INFO] at org.apache.maven.cli.MavenCli.execute(MavenCli.java:537) [INFO] at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:196) [INFO] at org.apache.maven.cli.MavenCli.main(MavenCli.java:141) [INFO] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) [INFO] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) [INFO] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) [INFO] at java.lang.reflect.Method.invoke(Method.java:601) [INFO] at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:290) [INFO] at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:230) [INFO] at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:409) [INFO] at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:352) [INFO] Caused by: org.apache.maven.plugin.MojoExecutionException: Deployment 
failed: repository element was not specified in the POM inside distributionManagement element or in -DaltDeploymentRepository=id::layout::url parameter [INFO] at org.apache.maven.plugin.deploy.DeployMojo.getDeploymentRepository(DeployMojo.java:235) [INFO] at org.apache.maven.plugin.deploy.DeployMojo.execute(DeployMojo.java:118) [INFO] at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:101) [INFO] at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:209) [INFO] ... 19 more I am using it through the maven-release-plugin. What's wrong?
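
    For reference, the fallback the error message itself names takes an id::layout::url triple; a sketch using the values from the POM above (layout "default" assumed):

        mvn deploy -DaltDeploymentRepository=local-hack-repo::default::file:///home/ondra/work/TOOLS/JUnitDiff/github/dist-maven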

    Read the article
