Search Results

Search found 5121 results on 205 pages for 'dotnetnuke connections'.

Page 45/205 | < Previous Page | 41 42 43 44 45 46 47 48 49 50 51 52  | Next Page >

  • How to make Jetty webserver listen on port 80?

    - by Jonas
    I would like to use Jetty as a webserver. I have edited the configuration file at /etc/default/jetty and set:

        # change to 0 to allow Jetty start
        NO_START=0
        # Listen to connections from this network host
        # Use 0.0.0.0 as host to accept all connections.
        JETTY_HOST=0.0.0.0

    Now I can reach the Jetty webserver at http://192.168.1.10:8080, but I would like to have Jetty listening on port 80. I have tried this setting in the same configuration file:

        # The network port used by Jetty
        JETTY_PORT=80

    and then restarted Jetty with sudo service jetty restart, but it doesn't work. How can I change the configuration so that the Jetty webserver listens on port 80?

    Read the article

  • Using SMO to drop a SQL Database

    - by ybbest
    SQL Server Management Objects (SMO) is the API you can use to manipulate SQL Server, for example to create or drop a database. For more details, check the MSDN documentation. There are two ways you can drop a database:

    1. Create a Database object and call its Drop method:

        Dim database As Database = New Database(yourDatabaseName)
        database.Drop()

    2. However, if there are existing connections to the database, attempting to drop it using the method above will fail. Recall that when you drop a database from Management Studio, you can tick the check box to close all connections before dropping the database. It is not so obvious, but you can do the exact same thing using SMO:

        Dim server As Server = New Server(ServerConn)
        server.KillAllProcesses(yourDatabaseName)
        server.KillDatabase(yourDatabaseName)

    Read the article

  • WCF Keep Alive: Whether to disable keepAliveEnabled

    - by Lijo
    I have a WCF web service hosted in a load-balanced environment. I do not need any WCF session-related functionality in the service.

    Question: In which scenarios will performance be best with

        keepAliveEnabled = false
        keepAliveEnabled = true

    Reference, from "Load Balancing": By default, the BasicHttpBinding sends a connection HTTP header in messages with a Keep-Alive value, which enables clients to establish persistent connections to the services that support them. This configuration offers enhanced throughput because previously established connections can be reused to send subsequent messages to the same server. However, connection reuse may cause clients to become strongly associated with a specific server within the load-balanced farm, which reduces the effectiveness of round-robin load balancing. If this behavior is undesirable, HTTP Keep-Alive can be disabled on the server using the KeepAliveEnabled property with a CustomBinding or user-defined Binding.

    Read the article

  • netbook alternate installation/update

    - by user11847
    I have an Aspire One D255E netbook. I installed 9.10 successfully, but I have no internet connection to upgrade to 10.04 or 10.10. I have the 10.10 alternate image (I couldn't get 10.04). However, it says that no CD-ROM is present (the netbook boots via live USB), and I directed it to sdb1, but that did not work. Could someone guide me through the steps for installation via the alternate USB only (and no internet)? With the live USBs of 10.04 and 10.10 the internet connection worked, but the (non-alternate) installation hung. Thank you greatly in advance.

    Read the article

  • Creating the Business Card Request InfoPath Form

    - by JKenderdine
    Business Card Request Demo Files

    Back in January I spoke at SharePoint Saturday Virginia Beach about InfoPath forms and Web Part deployment. Below is some of the information and details regarding the form I created for the session. There are many blogs and Microsoft articles on how to create a basic form, so I won't repeat that information here. This blog will just explain a few of the options I chose when creating the solutions for SPS Virginia Beach. The above link contains the zipped package files of the two InfoPath forms (no-code solution and coded solution), the list template for the Location list I used, and the PowerPoint deck. If you plan to use these templates, you will need to update the forms to work within your own environments (change data connections, code links, etc.). Also, you must have the SharePoint Enterprise version, with InfoPath Services configured, in order to use the web-browser-enabled forms.

    So what are the requirements for this template?

    Business Card Request Form Template design plan:
    - Gather user information and requirements for the card
    - Pull in as much user information as possible, using data from the user profile web services as a data source
    - Show and hide fields as necessary for requirements
    - Create multiple views – one for those submitting the form and another view for the executive assistants placing the orders
    - Browser-based form integrated into a SharePoint team site
    - Submitted directly to a form library

    The base form was created using the blank template. The table and rows were added using the Insert tab and selecting Custom Table. The use of tables is a great way to make sure everything lines up. You do have to split the tables from time to time. If you've ever split cells and then tried to re-align one to find that you impacted the others, you know why. Here is what the base form looks like in InfoPath.

    Show and hide fields as necessary for requirements

    You will notice I also used Sections within the form. These show or hide depending on options selected or whether or not fields are blank. This is a great way to prevent your users from feeling overwhelmed with a large form (this one wouldn't apply). Although not used in this one, you can also use various views with a tab interface. I'll show that in another post.

    Gather user information and requirements for the card; pull in as much user information as possible, using the user profile web services as a data source

    Utilizing rules, you can load data when the form initiates (Data tab, Form Load). Anything you can automate is always appreciated by the user, as that is data they don't have to enter - for example, loading their user id or other user information on load. Always keep in mind, though, how much data you load and the method for loading that data (through rules, code, etc.). They have an impact on form performance. The form will take longer to load if you bring in a ton of data from external sources. Laura Rogers has a great blog post on using the User Information List to load user information. If the user has logged into SharePoint, then this can be used quite effectively and without a huge performance hit. What I have found is that using the User Profile service via code-behind or the web service "GetUserProfileByName" (as above) can take more time to load the user data. Just food for thought.

    You must add the data connection in order for the above rules to work. You can connect to the data connection through the Data tab, Data Connections, or select the Manage Data Connections link which appears under the main data source. The data connections can be SharePoint lists or libraries, SQL data tables, XML files, etc.

    Create multiple views – one for those submitting the form and another view for the executive assistants placing the orders

    You can also create multiple views for the users to enhance their experience. Once they've entered the information and submitted their request for business cards, they don't really need to see the main data input screen any more. They just need to view what they entered. From the Page Design tab, select New View and give the view a name. To review the existing views, click the down arrow under View. The ReviewView shows just what the user needs and nothing more. Once you have everything configured, the form should be tested within a test SharePoint environment before final deployment to production. This validates that you don't have any rules or code that could impact the server negatively.

    Submitted directly to form library

    You will need to know the form library that you will be submitting to when publishing the template. Configure the Submit data connection to connect to this library. There is already one configured in the sample, but it will need to be updated to your environment prior to publishing. The design template is different from the published template. While both have the .XSN extension, the published template contains all the "package" information for the form. The published form is what is loaded into Central Admin, not the design template.

    Browser-based form integrated into SharePoint team site

    In Central Admin, under General Settings, select Manage Form Templates. Upload the published form template and activate it to a site collection. Now it is available as a content type to select in the form library. Some documentation on publishing form templates: Technet – Manage administrator approved form templates.

    And that's all our base requirements. Hope this helps to give a good start.

    Read the article

  • Updating a database connection password using a script

    - by Tim Dexter
    An interesting customer requirement that I thought was worthy of sharing today. Thanks to James for the requirement, Bryan for the proposed solution, and me for testing the solution and proving it works :0)

    A customer's implementation of Sarbanes-Oxley requires them to change all database account passwords every 90 days. This is scripted today, leveraging shell scripts, for most of their environments. But how can they manage the BI Publisher connections? Now, the customer is running 11g and therefore using WebLogic on the middle tier, which is the first clue to Bryan's proposed solution. To paraphrase and embellish Bryan's solution a little: why not use a JNDI connection from BIP to the database, then employ the WebLogic scripting engine to make updates to the JNDI as needed? BIP is completely uninvolved, and with a little 'timing' users will be completely unaware of the password updates, i.e. change the password when reports are not being executed. Perfect!

    James immediately tracked down the WLST script that could be used here, http://middlewaremagic.com/weblogic/?p=4261 (thanks Ravish). Now it was just a case of testing the theory. Some steps:

    - Create the JNDI connection in WLS
    - Create the JNDI connection in BI Publisher pointing to the WLS connection
    - Build new data models using, or re-point data sources to use, the JNDI connection
    - Create the WLST script to update the WLS JNDI password as needed
    - Test!

    Some details. Creating the JNDI connection in WebLogic is pretty straightforward:

    - Log into the console, look for Data Sources under the Services section of the home page and click it.
    - Click New >> Generic Datasource.
    - Give the connection a name. For the JNDI name, prefix it with 'jdbc/', so I have 'jdbc/localdb' - this name is important, you'll need it on the BIP side.
    - Select your db type - this will influence the drivers and information needed on the next page. Being a company man, I'm using an Oracle db. Click Next.
    - Select the driver of choice; there's lots, I know, you can read about them. I just chose 'Oracle's Driver (Thin) for Instance connections; Versions 9.0.1 and later'. Click Next >> Next.
    - Fill out the db name (SID), server, port, username to connect and password >> Next.
    - Test the config to ensure you can connect >> Next.
    - Now you need to deploy the connection to your BI server; select it and click Next. You're done with the JNDI config.

    Creating the JNDI connection on the Publisher side is covered here. Just remember to use the connection name you created in WLS, e.g. 'jdbc/localdb'.

    Building new data models or re-pointing data sources: not gonna tell you how to do this, go read the user guide :0) Suffice to say, it works.

    Creating the WLST script requires a little reading around the subject to understand the scripting engine and how to execute scripts. Nicely covered here. However, a bit of googlin' and I found an even easier way of running the script:

        ${ServerHome}/common/bin/wlst.sh updatepwd.py

    where updatepwd.py is my script file; it can be in another directory. As part of the wlst.sh script your environment is set up for you, so it's very simple to execute.

    The nitty gritty: you need to take Ravish's script above and create a file with a .py extension. It's going to need some modification, as he explains on the web page, to make it work in your environment. I played around with it for a while but kept running into errors. The script as is tries to loop through all of your connections and modify the user and passwords for each. Not quite what we are looking for. Remember, our requirement is to just update the password for a given connection.

    I also found another issue with the script. WLS 10.x does not allow updates to passwords using clear type, i.e. un-encrypted text, while the server is in production mode. It's a bit much to set it back to developer mode, bounce it, change the passwords, then bounce, change back to production and bounce again. After lots of messing about I finally came up with the following:

        #############################################################################
        #
        # Update password for JNDI connections
        #
        #############################################################################

        print("*** Trying to Connect.... *****")
        connect('weblogic','welcome1','t3://localhost:7001')
        print("*** Connected *****")

        edit()
        startEdit()

        print ("*** Encrypt the password ***")
        en = encrypt('hr')
        print "Encrypted pwd: ", en

        print ("*** Changing pwd for LocalDB ***")
        dsName = 'LocalDB'
        print 'Changing Password for DataSource ', dsName
        cd('/JDBCSystemResources/'+dsName+'/JDBCResource/'+dsName+'/JDBCDriverParams/'+dsName)
        set('PasswordEncrypted',en)

        save()
        activate()

    It's pretty simple and you can expand on it to loop through the data sources and change each as needed. I have hardcoded the password into the file, but you can pass it as a parameter as needed using the properties file method. I'm not going to get into the detail of that here, but it's covered with an example here.

    Couple of points to note:

    1. The change to the password requires a server bounce to get the changes picked up. You can add that to the shell script you will use to call the script above.
    2. The script above needs to be run from the MW_HOME\user_projects\domains\bifoundation_domain directory to get the encryption libraries set correctly. My command to run the whole script was:

        d:\oracle\bi_mw\wlserver_10.3\common\bin\wlst.cmd updatepwd.py

    where wlst.cmd is the scripting command line and updatepwd.py was my update password script above.

    I have not quite spoon-fed everything you need to make it a robust script, but at least you know you can do it, and you can work out the rest, I think :0)

    Read the article

  • Revert F3 and F4 behavior in byobu?

    - by James Zimmerman II
    With a fresh 12.04 installation of Ubuntu, byobu's interpretation of the function keys seems to have changed. F3 and F4 will still change the active tab on the current session, but it changes it for all connections. Previously, if you had two active connections to the same user (same session) with User 1 on shell 0/tab 0 and User 2 on shell 0/tab 0, when User 2 pressed F3, they would be changed to tab 1 while User 1 would remain on tab 0. With the latest version in 12.04, when User 2 presses F3, all connected users are moved to shell 1/tab 1. Is there any way to re-enable the old behavior? I have looked at the /etc/byobu directory and only found backend and socketdir files there. Toggling backend between tmux and screen did not seem to make any difference. Does anyone know what, if any, configuration options exist to control this?

    Read the article

  • How to control an Ubuntu PC from another Ubuntu PC over Internet?

    - by Naveen
    There are two Ubuntu PCs, called A and B. A and B are connected to the Internet using two separate Internet connections (in my case, two mobile broadband connections, ppp0 x2). Each connection has a unique and static public IP address. What I need is to control computer A's cursor, using computer B's mouse, over the Internet. On both computers, I have allowed other users to control my computer in the Desktop Sharing preferences, as below. When I try to connect to A from B using the Remmina Remote Desktop Client, it refuses to connect after trying for a while. These are my settings. I expect this to be done with available open-source software, not with TeamViewer. I found this guide hard to understand. Please provide me with clear instructions... Thanks for having a look!

    Read the article

  • Connectify Dispatch Links Multiple Network Nodes Into a Mega Connection

    - by Jason Fitzpatrick
    Connectify Dispatch wants to change the way you interact with the networks around you by making it dead simple to mesh all available Wi-Fi, Cellular, and Ethernet connections into a massive and stable pipeline. Dispatch makes it open-and-click easy to hook up multiple Wi-Fi nodes, your cellphone, and even Ethernet connections into a single blended connection. While the video above gives a great overview of the process, check out the video below to see it in real world action. The project is currently in the last phase of KickStarter funding, so now is a great time to score Connectify Dispatch at a steep discount–pledging as little as $10 to fund the project, for example, scores you 50% of a 6-month Pro license. Hit up the link below to read more about the project, check the KickStarter status, and see all the neat features in the development pipeline. Dispatch: The Internet, Faster. [KickStarter]

    Read the article

  • HttpClient multithread performance

    - by pepper
    I have an application which downloads more than 4500 html pages from 62 target hosts using HttpClient (4.1.3 or 4.2-beta). It runs on Windows 7 64-bit. Processor - Core i7 2600K. Network bandwidth - 54 Mb/s.

    At this moment it uses these parameters:

    - DefaultHttpClient and PoolingClientConnectionManager;
    - it also has the IdleConnectionMonitorThread from http://hc.apache.org/httpcomponents-client-ga/tutorial/html/connmgmt.html;
    - maximum total connections = 80;
    - default maximum connections per route = 5;
    - for thread management it uses ForkJoinPool with parallelism level = 5 (do I understand correctly that this is the number of worker threads?)

    In this case my network usage (in Windows Task Manager) does not rise above 2.5%. To download 4500 pages takes 70 minutes. And in the HttpClient logs I have things like this:

        DEBUG ForkJoinPool-2-worker-1 [org.apache.http.impl.conn.PoolingClientConnectionManager]: Connection released: [id: 209][route: {}-http://stackoverflow.com][total kept alive: 6; route allocated: 1 of 5; total allocated: 10 of 80]

    Total allocated connections do not rise above 10-12, in spite of the fact that I've set it to 80 connections. If I try to raise the parallelism level to 20 or 80, network usage remains the same but a lot of connection timeouts are generated. I've read the tutorials on hc.apache.org (HttpClient Performance Optimization Guide and HttpClient Threading Guide) but they do not help.

    The task's code looks like this:

        public class ContentDownloader extends RecursiveAction {

            private final HttpClient httpClient;
            private final HttpContext context;
            private List<Entry> entries;

            public ContentDownloader(HttpClient httpClient, List<Entry> entries) {
                this.httpClient = httpClient;
                context = new BasicHttpContext();
                this.entries = entries;
            }

            private void computeDirectly(Entry entry) {
                final HttpGet get = new HttpGet(entry.getLink());
                try {
                    HttpResponse response = httpClient.execute(get, context);
                    int statusCode = response.getStatusLine().getStatusCode();
                    if ((statusCode >= 400) && (statusCode <= 600)) {
                        logger.error("Couldn't get content from " + get.getURI().toString()
                                + "\n" + response.toString());
                    } else {
                        HttpEntity entity = response.getEntity();
                        if (entity != null) {
                            String htmlContent = EntityUtils.toString(entity).trim();
                            entry.setHtml(htmlContent);
                            EntityUtils.consumeQuietly(entity);
                        }
                    }
                } catch (Exception e) {
                } finally {
                    get.releaseConnection();
                }
            }

            @Override
            protected void compute() {
                if (entries.size() <= 1) {
                    computeDirectly(entries.get(0));
                    return;
                }
                int split = entries.size() / 2;
                invokeAll(new ContentDownloader(httpClient, entries.subList(0, split)),
                        new ContentDownloader(httpClient, entries.subList(split, entries.size())));
            }
        }

    And the question is: what is the best practice for using a multi-threaded HttpClient? Maybe there are some rules for setting up the ConnectionManager and HttpClient? How can I use all of the 80 connections and raise network usage? If necessary, I will provide more code.
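    For reference, here is a minimal sketch (not from the post) of how the pool limits mentioned above are typically raised on HttpClient 4.2's PoolingClientConnectionManager. The numbers and the host name are illustrative assumptions, not recommendations from the question:

        import org.apache.http.HttpHost;
        import org.apache.http.conn.routing.HttpRoute;
        import org.apache.http.impl.client.DefaultHttpClient;
        import org.apache.http.impl.conn.PoolingClientConnectionManager;

        public class PoolSetupSketch {
            public static DefaultHttpClient newClient() {
                // One pool shared by all worker threads.
                PoolingClientConnectionManager cm = new PoolingClientConnectionManager();
                cm.setMaxTotal(80);            // cap on connections across all hosts
                cm.setDefaultMaxPerRoute(20);  // per-host cap; the default of 5 is often the effective limit
                // The per-route cap can also be raised for an individual (hypothetical) host:
                cm.setMaxPerRoute(new HttpRoute(new HttpHost("example.com", 80)), 40);
                return new DefaultHttpClient(cm);
            }
        }

    With a per-route cap of 5 and a parallelism level of 5, the "route allocated: 1 of 5; total allocated: 10 of 80" line in the log above is consistent with the per-route and thread limits, rather than the total of 80, bounding throughput.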

    Read the article

  • Changing internal home network ip address for connected devices

    - by oshirowanen
    I have a few computers at home. For each of the computers, I can see the internal home network IP address on any given device by typing ifconfig in the terminal. If a device is connected to the home network via an Ethernet connection or via the built-in wireless in a laptop, the internal IP address for each of the devices seems to be 192.168.0.X. However, when I connect one of the devices using an external USB wireless modem adapter, which connects to the home network through wireless, and I check the IP address via ifconfig, for some reason it gets assigned 192.168.42.X instead. Why are the Ethernet and built-in wireless connections getting 192.168.0.X, while the external USB wireless adapter gets 192.168.42.X? Most importantly, is it possible to force it to get an internal IP address of 192.168.0.X?

    Read the article

  • What is So Unique About Node.js?

    - by Adrian Shum
    Recently there has been a lot of praise for Node.js. I am not a developer that has had much exposure to network applications. From my bare understanding of Node.js, its strength is that we have only one thread handling multiple connections, providing an event-based architecture. However, in Java for example, I can create only one thread using NIO/AIO (which are non-blocking APIs, from my bare understanding), handle multiple connections using that thread, and provide an event-based architecture to implement the data handling logic (it shouldn't be that difficult by providing some callbacks, etc.). Given that the JVM is an even more mature VM than V8 (I expect it to run faster too), and that an event-based handling architecture does not seem difficult to create, I am not sure why Node.js is attracting so much attention. Did I miss some important points?
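    To make the Java side of the comparison concrete, here is a minimal sketch (not from the post) of the single-threaded, selector-based pattern the question refers to: one thread, one java.nio Selector, many connections, with accept/read readiness delivered as events. The port and the echo behaviour are purely illustrative:

        import java.io.IOException;
        import java.net.InetSocketAddress;
        import java.nio.ByteBuffer;
        import java.nio.channels.SelectionKey;
        import java.nio.channels.Selector;
        import java.nio.channels.ServerSocketChannel;
        import java.nio.channels.SocketChannel;
        import java.util.Iterator;

        // Single-threaded, event-driven echo server: one Selector multiplexes all connections.
        public class NioEchoServerSketch {
            public static void main(String[] args) throws IOException {
                Selector selector = Selector.open();
                ServerSocketChannel server = ServerSocketChannel.open();
                server.configureBlocking(false);
                server.socket().bind(new InetSocketAddress(9090));
                server.register(selector, SelectionKey.OP_ACCEPT);

                ByteBuffer buffer = ByteBuffer.allocate(4096);
                while (true) {
                    selector.select();  // block until at least one channel is ready
                    Iterator<SelectionKey> it = selector.selectedKeys().iterator();
                    while (it.hasNext()) {
                        SelectionKey key = it.next();
                        it.remove();
                        if (key.isAcceptable()) {           // "new connection" event
                            SocketChannel client = server.accept();
                            client.configureBlocking(false);
                            client.register(selector, SelectionKey.OP_READ);
                        } else if (key.isReadable()) {      // "data available" event
                            SocketChannel client = (SocketChannel) key.channel();
                            buffer.clear();
                            int n = client.read(buffer);
                            if (n == -1) {
                                client.close();             // peer closed the connection
                                continue;
                            }
                            buffer.flip();
                            client.write(buffer);           // echo back; partial writes ignored in this sketch
                        }
                    }
                }
            }
        }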

    Read the article

  • the best "Simple" CMS system suitable for .Net MVC

    - by Truegilly
    Hello, I'm developing an MVC .NET site and would like to implement a CMS system. So far I have looked at Umbraco, which looks good but the help is poor and the getting-started video section is empty, and DotNetNuke, which again looks good but I get the impression it's aimed at non-developers, plus it has much more than I need and want. In my last job I created a basic custom CMS system that worked by simply adding values into the database; the application pulled the text from there. It worked fine, but I don't have the source code. What I'm after is a simple CMS system that really just controls text and images. I don't need all the fancy stuff that's in Umbraco and DotNetNuke; all the design and CSS I can do myself. Also, as I'm working with MVC (which is awesome, and such a breath of fresh air compared to the web forms/Telerik crap I have to put up with at work), it needs to be compatible. A simple CMS - can anyone provide any suggestions? Truegilly

    Read the article

  • Why is Node.js that "unique"?

    - by Adrian Shum
    In recent years there has been a lot of praise for Node.js. I am not a developer that has had much exposure to network applications. From my bare understanding of Node.js, its strength is that we have only one thread handling multiple connections, providing an event-based architecture. However, in Java for example, what if I have only one thread, using NIO/AIO (which are non-blocking APIs, from my bare understanding), handle multiple connections using that thread, and provide an event-based architecture to implement the data handling logic (it shouldn't be that difficult by providing some callbacks, etc.)? Given that the JVM is an even more mature VM than V8 (I expect it to run faster too), and that an event-based handling architecture does not seem difficult to create, I am not sure why Node.js is attracting so much attention. Did I miss some important points?

    Read the article

  • How to connect to the internet via command line or graphical utility given I can't click at the top of the screen?

    - by Ben
    My Ubuntu 12.04 installation has an input problem preventing me from clicking near the top of the screen. That is, there is an unclickable area, circa 60 pixels high, stretching all the way across the screen. I will ask about this in a later question. For now, I would just like to be able to connect to the internet, given this limitation. I am able to access the Network Connections application by hitting the Super key and typing "network connections". There is a LAN and a wireless network available. The network icon in the bar at the top of the screen shows an empty wedge, which I suppose means that wireless is off. I've attached the cable for the Ethernet connection, but it does not seem to have connected automatically. How can I figure out what's going on? (I'll happily edit in the output of any relevant terminal commands.) Thank you.

    Read the article

  • Data Source Security Part 1

    - by Steve Felts
    I’ve written a couple of articles on how to store data source security credentials using the Oracle wallet. I plan to write a few articles on the various types of security available to WebLogic Server (WLS) data sources. There are more options than you might think!

    There have been several enhancements in this area in WLS 10.3.6. There are a couple more enhancements planned for release WLS 12.1.2 that I will include here for completeness. This isn't intended as a teaser. If you call your Oracle support person, you can get them now as minor patches to WLS 10.3.6.

    The current security documentation is scattered in a few places, has a few incorrect statements, and is missing a few topics. It also seems that the knowledge of how to apply some of these features isn't written down. The goal of these articles is to talk about WLS data source security in a unified way and to introduce some approaches to using the available features.

    Introduction to WebLogic Data Source Security Options

    By default, you define a single database user and password for a data source. You can store it in the data source descriptor or make use of the Oracle wallet. This is a very simple and efficient approach to security. All of the connections in the connection pool are owned by this user and there is no special processing when a connection is given out. That is, it's a homogeneous connection pool and any request can get any connection from a security perspective (there are other aspects like affinity). Regardless of the end user of the application, all connections in the pool use the same security credentials to access the DBMS. No additional information is needed when you get a connection because it's all available from the data source descriptor (or wallet).

        java.sql.Connection conn = mydatasource.getConnection();

    Note: You can enter the password as a name-value pair in the Properties field (this is not permitted for production environments) or you can enter it in the Password field of the data source descriptor. The value in the Password field overrides any password value defined in the Properties passed to the JDBC Driver when creating physical database connections. It is recommended that you use the Password attribute in place of the password property in the properties string because the Password value is encrypted in the configuration file (stored as the password-encrypted attribute in the jdbc-driver-params tag in the module file) and is hidden in the administration console. The Properties and Password fields are located on the administration console Data Source creation wizard or Data Source Configuration tab.

    The JDBC API can also be used to programmatically specify a database user name and password, as in the following.

        java.sql.Connection conn = mydatasource.getConnection("user", "password");

    According to the JDBC specification, it's supposed to take a database user and associated password, but different vendors implement this differently. WLS, by default, treats this as an application server user and password. The pair is authenticated to see if it's a valid user and that user is used for WLS security permission checks. By default, the user is then mapped to a database user and password using the data source credential mapper, so this API sort of follows the specification but database credentials are one step removed from the application code. More details and the rationale are described later.
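    As a concrete illustration of the two calls above (this sketch is not from the article; the JNDI name jdbc/myDS and the credentials are hypothetical placeholders), a client running inside the server would typically look the data source up through JNDI and then use either form of getConnection():

        import java.sql.Connection;
        import javax.naming.InitialContext;
        import javax.sql.DataSource;

        public class DataSourceLookupSketch {
            public static void demo() throws Exception {
                // Hypothetical JNDI name; substitute the name your data source is bound to.
                DataSource mydatasource =
                        (DataSource) new InitialContext().lookup("jdbc/myDS");

                // Default behavior: all pool connections use the credentials
                // stored in the data source descriptor (or the Oracle wallet).
                Connection conn = mydatasource.getConnection();
                conn.close();

                // Two-argument form: by default WLS treats the pair as an application
                // server user/password and maps it to database credentials through
                // the data source credential mapper, as described above.
                Connection conn2 = mydatasource.getConnection("user", "password");
                conn2.close();
            }
        }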
    While the default approach is simple, it does mean that only one database user is doing all of the work. You can't figure out who actually did the update and you can't restrict SQL operations by who is running the operation, at least at the database level. Any type of per-user logic will need to be in the application code instead of having the database do it. There are various WLS data source features that can be configured to provide some per-user information about the operations to the database.

    WebLogic Data Source Security Options

    This table describes the features available for WebLogic data sources to configure database security credentials, with a brief description of each. It also captures information about the compatibility of these features with one another.

    Feature: User authentication (default)
      Description: Default getConnection(user, password) behavior – validate the input and use the user/password in the descriptor.
      Can be used with: Set client identifier
      Can't be used with: Proxy Session, Identity pooling, Use database credentials

    Feature: Use database credentials
      Description: Instead of using the credential mapper, use the supplied user and password directly.
      Can be used with: Set client identifier, Proxy session, Identity pooling
      Can't be used with: User authentication, Multi Data Source

    Feature: Set Client Identifier
      Description: Set a client identifier property associated with the connection (Oracle and DB2 only).
      Can be used with: Everything
      Can't be used with: (none)

    Feature: Proxy Session
      Description: Set a light-weight proxy user associated with the connection (Oracle-only).
      Can be used with: Set client identifier, Use database credentials
      Can't be used with: Identity pooling, User authentication

    Feature: Identity pooling
      Description: Heterogeneous pool of connections owned by specified users.
      Can be used with: Set client identifier, Use database credentials
      Can't be used with: Proxy session, User authentication, Labeling, Multi-datasource, Active GridLink

    Note that all of these features are available with both XA and non-XA drivers.

    Currently, the Proxy Session and Use Database Credentials options are on the Oracle tab of the Data Source Configuration tab of the administration console (even though the Use Database Credentials feature is not just for Oracle databases – oops). The rest of the features are on the Identity tab of the Data Source Configuration tab in the administration console (plan on seeing them all in one place in the future).

    The subsequent articles will describe these features in more detail. Keep referring back to this table to see the big picture.

    Read the article

  • Ubuntu 12.04 and Realtek Wireless Card Incompatible

    - by Tim
    Quick background: I installed Ubuntu 12.04 on a Dell Inspiron 3520 after uninstalling Windows 8. A friend suggested doing this as opposed to buying a laptop with Linux pre-installed. Everything except the wireless card works. When I load up the laptop, I see a wireless animation as if it's trying to locate the wireless connection (for a while, then it goes blank), but no connections appear under the wireless option (under "Edit Connections", then "Wireless").

    What I've tried:
    - Using ndiswrapper. Failed.
    - Installing a few kernels for Realtek. Failed.

    Read the article

  • iptables rule problem

    - by thakrage
    I've been searching around for some time now, but nothing solves my problem. I'm setting up a mail server, but when writing to the iptables, I get an error: iptables-restore: line 2 failed. I'm trying to use the following /etc/iptables.test.rules:

        # Allows SMTP access
        -A INPUT -p tcp --dport 25 -j ACCEPT

        # Allows pop and pops connections
        -A INPUT -p tcp --dport 110 -j ACCEPT
        -A INPUT -p tcp --dport 995 -j ACCEPT

        # Allows imap and imaps connections
        -A INPUT -p tcp --dport 143 -j ACCEPT
        -A INPUT -p tcp --dport 993 -j ACCEPT

    After this, I'm issuing the following command:

        sudo iptables-restore < /etc/iptables.test.rules

    However, I get returned this:

        iptables-restore: line 2 failed.

    I don't know what the problem is. Can anyone clarify? BTW, I'm using Ubuntu 10.10 LTS.

    Read the article

  • OpenVpn Iptables Error

    - by Mook
    I mean real newbie (to Linux) here. Please help me configure my OpenVPN through iptables. My main goal here is to open ports for regular browsing (80, 443), email (110, 25), etc., just like an ISP does, but I want to block p2p traffic. So I will need to open only a few ports. Here is my iptables config:

        # Flush all current rules from iptables
        #
        iptables -F
        iptables -t nat -F
        iptables -t mangle -F
        #
        # Allow SSH connections on tcp port 22 (or whatever port you want to use)
        #
        iptables -A INPUT -p tcp --dport 22 -j ACCEPT
        #
        # Set default policies for INPUT, FORWARD and OUTPUT chains
        #
        iptables -P INPUT DROP   # using DROP for INPUT is not always recommended. Change to ACCEPT if you prefer.
        iptables -P FORWARD ACCEPT
        iptables -P OUTPUT ACCEPT
        #
        # Set access for localhost
        #
        iptables -A INPUT -i lo -j ACCEPT
        #
        # Accept packets belonging to established and related connections
        #
        iptables -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT
        #
        # Accept connections on 1194 for vpn access from clients
        # Take note that the rule says "UDP", and ensure that your OpenVPN server.conf says UDP too
        #
        iptables -A INPUT -p udp --dport 1194 -j ACCEPT
        #
        # Apply forwarding for OpenVPN Tunneling
        #
        iptables -A FORWARD -m state --state RELATED,ESTABLISHED -j ACCEPT
        iptables -A FORWARD -s 10.8.0.0/24 -j ACCEPT   # 10.8.0.0 ? Check your OpenVPN server.conf to be sure
        iptables -A FORWARD -j REJECT
        iptables -t nat -A POSTROUTING -o venet0 -j SNAT --to-source 100.200.255.256   # Use your OpenVPN server's real external IP here
        #
        # Enable forwarding
        #
        echo 1 > /proc/sys/net/ipv4/ip_forward

        iptables -A INPUT -p tcp --dport 25 -j ACCEPT
        iptables -A INPUT -p tcp --dport 26 -j ACCEPT
        iptables -A INPUT -p tcp --dport 80 -j ACCEPT
        iptables -A INPUT -p tcp --dport 110 -j ACCEPT
        iptables -A INPUT -p tcp --dport 443 -j ACCEPT

        iptables -L -v

    But when I connect to my VPN, I can't browse and I also get request timeouts (RTO) when pinging yahoo, etc.

    Read the article

  • Advice needed: ADSL and VPN for a small company

    - by Saajid Ismail
    Hi. I need advice on purchasing an ADSL modem/router for a small company. At the moment, we are using the iBurst Wireless service for internet connectivity. I have the iBurst desktop modem, which connects to my Netgear WNR2000 router via ethernet. I am using the Netgear WNR2000 to deploy a wireless network as well. I have also set up a VPN using Windows Server 2003, and enabled the VPN Passthrough settings on the Netgear router. I am able to connect to the office network remotely without difficulty. However, the problem that I've read is that the Netgear WNR2000 only supports VPN passthrough for a single session. This is simply not good enough. I need to be able to support at least 3 concurrent VPN connections immediately, and up to 5 in the near future.

    Now I am cancelling my iBurst Wireless service and have just got my ADSL line installed. I have to purchase an ADSL modem, and now is a good time to think of future proofing my investment. I need a good ADSL modem that will allow me to support at least 5 concurrent VPN connections, or more, without breaking the bank. My budget is about 150-200 USD. I believe that my current Netgear WNR2000 router will be useless, except maybe to extend my wireless network in the future by a bit. Is there a solution where I can still use my Netgear WNR2000 for WiFi, for e.g. by connecting a cheaper non-WiFi ADSL modem to the Netgear router? If not, then which WiFi-enabled ADSL modem/router that supports at least 5 VPN passthroughs can you recommend?

    To sum it up, I need an ADSL modem/router that:
    - is ADSL & ADSL2+ compatible
    - has built-in 802.11n 270/300mbps WiFi (if having this feature doesn't push the price up too much)
    - supports at least 5 VPN connections using VPN passthrough

    EDIT: Answer 2.10 in the following FAQ has me a bit worried - What is VPN/multiple VPN Pass-through?

    Read the article

  • Remote SQL server connection failure

    - by Sevki
    I am trying to connect to my MSSQL Server 2008 web instance and I'm failing horribly... I get error 26, and before you jump on me, I have done these:

    - Checked the spelling of the SQL Server instance name that is specified in the connection string.
    - Used the SQL Server Surface Area Configuration tool to enable SQL Server to accept remote connections over the TCP or named pipes protocols. For more information about the SQL Server Surface Area Configuration Tool, see Surface Area Configuration for Services and Connections.
    - Made sure that I have configured the firewall on the server instance of SQL Server to open ports for SQL Server and the SQL Server Browser port (UDP 1434).
    - Made sure that the SQL Server Browser service is started on the server.

    In addition to these, I have disabled the firewall completely and tried other ports. Nothing works. The same credentials work on the server but not on the client. This is the exact error message:

        A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: SQL Network Interfaces, error: 26 - Error Locating Server/Instance Specified) (.Net SqlClient Data Provider)

    Can anybody help?

    Read the article

  • Server and Network Lag

    - by DanSpd
    At my home I have a server and my computer. The server has Windows Server 2008 Standard 64-bit installed, and my computer has Windows 7 64-bit installed. My home router is a DIR-615. My server has a game server installed on it, and the total open connections on the server are around 700. Once it's around 700, the server and then the whole home network lag really badly. My internet connection is good and only around 600 kb/s is being used out of 2 mb/s (real download/upload speeds). So far I assume that my router has a connection limit, and with the server's plus my computer's connections it peaks at around 1k and lags everything. Another assumption is that the server lag comes from a connection limit. On the internet I have read that the Standard version of the server has a connection limit of 700 + 10 reserved. I do not remember where I found that information, but it was on the Microsoft website. So I have two options. The first is to upgrade my router to a Netgear FVS318G-100 and the second is to change the connection limit on the server. Any advice? Thank you.

    Read the article

  • strange Postfix logwatch log summary on my ubuntu vps

    - by DannyRe
    Hi, I would be very thankful if someone could help me by explaining this logwatch summary of my Postfix installation on my Ubuntu 10.04 VPS. I don't really know whether this is a normal log, because of the many "authentication failed" entries and foreign IP addresses. Any advice for a novice? Thanks!

        ****** Summary *************************************************************************************
            113   SASL authentication failed
            195   Miscellaneous warnings

         8.419K   Bytes accepted                           8,621
         8.419K   Bytes delivered                          8,621
        ========   ==================================================

              3   Accepted                                 60.00%
              2   Rejected                                 40.00%
        --------   --------------------------------------------------
              5   Total                                   100.00%
        ========   ==================================================

              2   5xx Reject relay denied                 100.00%
        --------   --------------------------------------------------
              2   Total 5xx Rejects                       100.00%
        ========   ==================================================

            116   Connections
              1   Connections lost (inbound)
            116   Disconnections
              3   Removed from queue
              3   Delivered
              1   Hostname verification errors

        ****** Detail (10) *********************************************************************************
            113   SASL authentication failed --------------------------------------------------------------
                113   92.24.80.207  host-92-24-80-207.ppp.as43234.net
                   113   LOGIN
                      113   generic failure

            195   Miscellaneous warnings ------------------------------------------------------------------
                113   SASL authentication failure: cannot connect to saslauthd server: Permission denied
                 41   inet_protocols: IPv6 support is disabled: Address family not supported by protocol
                 41   inet_protocols: configuring for IPv4 support only

              2   5xx Reject relay denied -----------------------------------------------------------------
                  1   46.242.103.110  unknown
                     1   [email protected]
                  1   114.42.142.103  114-42-142-103.dynamic.hinet.net
                     1   [email protected]

              1   Connections lost (inbound) --------------------------------------------------------------
                  1   After RCPT

              3   Delivered -------------------------------------------------------------------------------
                  3   myhost.xx

              1   Hostname verification errors ------------------------------------------------------------
                  1   Name or service not known
                     1   46.242.103.110  broadband-46-242-103-110.nationalcablenetworks.ru

        === Delivery Delays Percentiles ============================================================
             0%    25%    50%    75%    90%    95%    98%    100%

    Read the article
