Search Results

Search found 25093 results on 1004 pages for 'console output'.

Page 149/1004 | < Previous Page | 145 146 147 148 149 150 151 152 153 154 155 156  | Next Page >

  • set default java version

    - by Dónal
    I have been using Java 6 on Ubuntu 11.10, but now I want to update to version 7. I've installed version 7 via PPA as described here. If I run sudo update-alternatives --config java I get the following output:
    There are 2 choices for the alternative java (providing /usr/bin/java).
      Selection    Path                                       Priority   Status
      ------------------------------------------------------------
      0            /usr/lib/jvm/java-7-oracle/jre/bin/java    64         auto mode
      1            /usr/lib/jvm/java-6-sun/jre/bin/java       63         manual mode
    * 2            /usr/lib/jvm/java-7-oracle/jre/bin/java    64         manual mode
    Similarly, if I run sudo update-alternatives --config javac I get the output:
      Selection    Path                                       Priority   Status
      ------------------------------------------------------------
      0            /usr/lib/jvm/java-7-oracle/bin/javac       64         auto mode
      1            /usr/lib/jvm/java-6-sun/bin/javac          63         manual mode
    * 2            /usr/lib/jvm/java-7-oracle/bin/javac       64         manual mode
    So it looks like version 7 is already the default. But if I run either java -version or javac -version, the output indicates that version 6 is still the default. How can I set the default to version 7?

    Read the article

  • WebCenter Content (WCC) Trace Sections

    - by Kevin Smith
    Kyle has a good post on how to modify the size and number of WebCenter Content (WCC) trace files. His post reminded me I have been meaning to write a post on WCC trace sections for a while.
    searchcache - Tells you if your query was found in the WCC search cache.
    searchquery - Shows the processing of the query as it is converted from what the user submitted to the end query that will be sent to the database. Shows the conversion from the universal query syntax to the syntax specific to the search solution WCC is configured to use.
    services (verbose) - Lists the filters that are called for each service. This will let you know what filters are available for each service and will also tell you what filters are used by WCC add-on components and any custom components you have installed. The How To Component Sample has a list of filters, but it has not been updated since 7.5, so it is a little outdated now. With each new release WCC adds more filters. If you have a filter that has no code attached to it you will see output like this:
    services/6    09.25 06:40:26.270    IdcServer-423    Called filter event computeDocName with no filter plugins registered
    When a WCC add-on or custom component uses a filter you will see trace output like this:
    services/6    09.25 06:40:26.275    IdcServer-423    Calling filter event postValidateCheckinData on class collections.CollectionValidateCheckinData with parameter postValidateCheckinData
    services/6    09.25 06:40:26.275    IdcServer-423    Calling filter event postValidateCheckinData on class collections.CollectionFilters with parameter postValidateCheckinData
    As you can see from this sample output, it is possible to have multiple code points using the same filter.
    systemdatabase - Dumps the database call AFTER it executes. This can be somewhat troublesome if you are trying to track down some weird database problems. We had a problem where WCC was getting into a deadlock situation. We turned on the systemdatabase trace section and thought we had the problem database call, but it turned out that since it printed out the database call after it was executed, we were looking at the database call BEFORE the one causing the deadlock. We ended up having to turn on tracing at the database level to see the database call WCC was making that was causing the deadlock.
    socketrequests (verbose) - Dumps the actual messages received and sent over the socket connection by WCC for a service. If you have gzip enabled you will see junk on the response coming back from WCC. For debugging, disable the gzip of the WCC response. Here is an example of the dump of the request for a GET_SEARCH_RESULTS service call.
    socketrequests/6 09.25 06:46:02.501 IdcServer-6 request: REMOTE_USER=sysadmin.USER-AGENT=Java;.Stel
    socketrequests/6 09.25 06:46:02.501 IdcServer-6 request: lent.CIS.11g.CONTENT_TYPE=text/html.HEADER
    socketrequests/6 09.25 06:46:02.501 IdcServer-6 request: _ENCODING=UTF-8.REQUEST_METHOD=POST.CONTEN
    socketrequests/6 09.25 06:46:02.501 IdcServer-6 request: T_LENGTH=270.HTTP_HOST=CIS.$$$$.NoHttpHead
    socketrequests/6 09.25 06:46:02.501 IdcServer-6 request: ers=0.IsJava=1.IdcService=GET_SEARCH_RESUL
    socketrequests/6 09.25 06:46:02.501 IdcServer-6 request: [email protected]
    socketrequests/6 09.25 06:46:02.501 IdcServer-6 request: calData.SortField=dDocName.ClientEncoding=
    socketrequests/6 09.25 06:46:02.501 IdcServer-6 request: UTF-8.IdcService=GET_SEARCH_RESULTS.UserTi
    socketrequests/6 09.25 06:46:02.501 IdcServer-6 request: meZone=UTC.UserDateFormat=iso8601.SortDesc
    socketrequests/6 09.25 06:46:02.501 IdcServer-6 request: =ASC.QueryText=dDocType..matches..`Documen
    socketrequests/6 09.25 06:46:02.501 IdcServer-6 request: t`.@end.
    userstorage, jps - Provides trace details for user authentication and authorization. Includes information on the determination of what roles and accounts a user has access to. In 11g a new trace section, jps, was added with the addition of the JpsUserProvider to communicate with WebLogic Server.
    The WCC developers decide when to use the verbose option for their trace output, so sometimes you need to try verbose to see what different information you get. One of the things I would always have liked to see is the ability to turn on verbose output selectively for individual trace sections. When you turn on verbose output you get it for all trace sections you have enabled. This can quickly fill up your trace files with a lot of information if you have the socket trace section turned on.

    Read the article

  • How to switch 'default' sound device controlled by hardware keys in Xubuntu?

    - by Ruth
    I installed xubuntu-desktop on a 12.04 Ubuntu upgrade after finding Gnome3 lacking. I've mostly been happy, but I've found an odd and frustrating bug. My laptop has two sound 'outputs' - an HDMI-out plug I never use, and the onboard speakers/headphones. For some reason, the hardware keys have been mapped to the HDMI output, even if I set it as 'fallback' in pavucontrol, and notify-osd only displays changes in the HDMI output (though the panel indicator volume control controls onboard sound). I'd ideally like both hardware keys and notify-osd to be looking at the onboard sound, though if I can't get notify-osd it's an acceptable loss. Having to click through a bunch of stuff to change volume is driving me crazy, though. Googling suggested that it /may/ be a Pulseaudio/ALSA conflict, but the hardware keys seem to change at least indicated volume in pavucontrol for HDMI as expected (I don't have an HDMI cable to test actual sound output)

    Read the article

  • lubuntu - Sound card detected but no sound

    - by CookieMonster
    I installed Lubuntu 12.10 on my old laptop (Sharp Mebius PC-MR80J), but sound does not work. Here are the things I tried. aplay -l does not output anything. When I run alsamixer, I see only one bar that says "Beep". I installed pavucontrol, pulseaudio, pulseaudio-utils and libgtk-3-0 and checked output devices, but I only see "Dummy output" and there are no hardware output devices. cat /proc/asound/cards returns the following:
    0 [Intel ]: HDA-Intel - HDA Intel
    HDA Intel at 0xfeb38000 irq 40
    cat /proc/asound/devices outputs the following:
    1: : sequencer
    2: [ 0- 1]: hardware dependent
    3: [ 0- 0]: hardware dependent
    4: [ 0] : control
    33: : timer
    The laptop was running Windows XP before and sound (both input and output) was working. What can I do now?

    Read the article

  • How to commit a file conversion?

    - by l0b0
    Say you've committed a file of type foo in your favorite vcs: $ vcs add data.foo $ vcs commit -m "My data" After publishing you realize there's a better data format bar. To convert you can use one of these solutions: $ vcs mv data.foo data.bar $ vcs commit -m "Preparing to use format bar" $ foo2bar --output data.bar data.bar $ vcs commit -m "Actual foo to bar conversion" or $ foo2bar --output data.foo data.foo $ vcs commit -m "Converted data to format bar" $ vcs mv data.foo data.bar $ vcs commit -m "Renamed to fit data type" or $ foo2bar --output data.bar data.foo $ vcs rm data.foo $ vcs add data.bar $ vcs commit -m "Converted data to format bar" In the first two cases the conversion is not an atomic operation and the file extension is "lying" in the first commit. In the last case the conversion will not be detected as a move operation, so as far as I can tell it'll be difficult to trace the file history across the commit. Although I'd instinctively prefer the last solution, I can't help thinking that tracing history should be given very high priority in version control. What is the best thing to do here?

    Read the article

  • Working with packed dates in SSIS

    - by Jim Giercyk
    One of the challenges recently thrown my way was to read an EBCDIC flat file, decode packed dates, and insert the dates into a SQL table. For those unfamiliar with packed data, it is a way to store data at the nibble level (half a byte), and was often used by mainframe programmers to conserve storage space. In the case of my input file, the dates were 2 bytes long and represented the number of days that have passed since 01/01/1950. My first thought was, in the words of Scooby, Hmmmmph? But, I love a good challenge, so I dove in. Reading in the flat file was rather simple. The only difference between reading an EBCDIC and an ASCII file is the Code Page option in the connection manager. In my case, I needed to use Code Page 1140 for EBCDIC (I could have also used Code Page 37). Once the code page is set correctly, SSIS can understand what it is reading and it will convert the output to the default code page, 1252. However, packed data is either unreadable or produces non-alphabetic characters, as we can see in the preview window. Column 1 is actually the packed date; columns 0 and 2 are the values in the rest of the file. We are only interested in Column 1, which is a 2-byte field representing a packed date. We know that 2 digits of packed data can be stored in 1 byte of character data, so we are working with 4 packed digits in 2 character bytes. If you are confused, stay tuned…this will make sense in a minute. Right-click on your Flat File Source shape and select “Show Advanced Editor”. Here is where the magic begins. By changing the properties of the output columns, we can access the packed digits from each byte. By default, the Output Column data type is DT_STR. Since we want to look at the bytes individually and not the entire string, change the data type to DT_BYTES. Next, and most important, set UseBinaryFormat to TRUE. This will write the HEX VALUES of the output string instead of writing the character values. Now we are getting somewhere! Next, you will need to use a Data Conversion shape in your Data Flow to transform the 2-position byte stream to a 4-position Unicode string containing the packed data. You need the string to be 4 bytes long because it will contain the 4 packed digits. Here is what that should look like in the Data Conversion shape: Direct the output of your data flow to a test table or file to see the results. In my case, I created a test table. The results looked like this: Hold on a second! That doesn't look like a date at all. No, of course not. It is a hex number which represents the days which have passed between 01/01/1950 and the date. We have to convert the Hex value to a decimal value, and use the DATEADD function to get a date value.
Luckily, I have created a function to convert Hex to Decimal:   -- ============================================= -- Author:        Jim Giercyk -- Create date: March, 2012 -- Description:    Converts a Hex string to a decimal value -- ============================================= CREATE FUNCTION [dbo].[ftn_HexToDec] (     @hexValue NVARCHAR(6) ) RETURNS DECIMAL AS BEGIN     -- Declare the return variable here DECLARE @decValue DECIMAL IF @hexValue LIKE '0x%' SET @hexValue = SUBSTRING(@hexValue,3,4) DECLARE @decTab TABLE ( decPos1 VARCHAR(2), decPos2 VARCHAR(2), decPos3 VARCHAR(2), decPos4 VARCHAR(2) ) DECLARE @pos1 VARCHAR(1) = SUBSTRING(@hexValue,1,1) DECLARE @pos2 VARCHAR(1) = SUBSTRING(@hexValue,2,1) DECLARE @pos3 VARCHAR(1) = SUBSTRING(@hexValue,3,1) DECLARE @pos4 VARCHAR(1) = SUBSTRING(@hexValue,4,1) INSERT @decTab VALUES (CASE               WHEN @pos1 = 'A' THEN '10'                 WHEN @pos1 = 'B' THEN '11'               WHEN @pos1 = 'C' THEN '12'               WHEN @pos1 = 'D' THEN '13'               WHEN @pos1 = 'E' THEN '14'               WHEN @pos1 = 'F' THEN '15'               ELSE @pos1              END, CASE               WHEN @pos2 = 'A' THEN '10'                 WHEN @pos2 = 'B' THEN '11'               WHEN @pos2 = 'C' THEN '12'               WHEN @pos2 = 'D' THEN '13'               WHEN @pos2 = 'E' THEN '14'               WHEN @pos2 = 'F' THEN '15'               ELSE @pos2              END, CASE               WHEN @pos3 = 'A' THEN '10'                 WHEN @pos3 = 'B' THEN '11'               WHEN @pos3 = 'C' THEN '12'               WHEN @pos3 = 'D' THEN '13'               WHEN @pos3 = 'E' THEN '14'               WHEN @pos3 = 'F' THEN '15'               ELSE @pos3              END, CASE               WHEN @pos4 = 'A' THEN '10'                 WHEN @pos4 = 'B' THEN '11'               WHEN @pos4 = 'C' THEN '12'               WHEN @pos4 = 'D' THEN '13'               WHEN @pos4 = 'E' THEN '14'               WHEN @pos4 = 'F' THEN '15'               ELSE @pos4              END) SET @decValue = (CONVERT(INT,(SELECT decPos4 FROM @decTab)))         +                 (CONVERT(INT,(SELECT decPos3 FROM @decTab))*16)      +                 (CONVERT(INT,(SELECT decPos2 FROM @decTab))*(16*16)) +                 (CONVERT(INT,(SELECT decPos1 FROM @decTab))*(16*16*16))     RETURN @decValue END GO     Making use of the function, I found the decimal conversion, added that number of days to 01/01/1950 and FINALLY arrived at my “unpacked relative date”.  Here is the query I used to retrieve the formatted date, and the result set which was returned: SELECT [packedDate] AS 'Hex Value',        dbo.ftn_HexToDec([packedDate]) AS 'Decimal Value',        CONVERT(DATE,DATEADD(day,dbo.ftn_HexToDec([packedDate]),'01/01/1950'),101) AS 'Relative String Date'   FROM [dbo].[Output Table]         This technique can be used any time you need to retrieve the hex value of a character string in SSIS.  The date example may be a bit difficult to understand at first, but with SSIS becoming the preferred tool for enterprise level integration for many companies, there is no doubt that developers will encounter these types of requirements with regularity in the future. Please feel free to contact me if you have any questions.
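
    For illustration, here is a minimal Python sketch of the same conversion done outside SQL. It assumes, as in the post, that the packed value arrives as a hex string holding a day count relative to 01/01/1950; the sample value 0x5B2F is made up for the example.

    from datetime import date, timedelta

    def packed_hex_to_date(hex_value, epoch=date(1950, 1, 1)):
        """Convert a 4-digit hex day count (e.g. '0x5B2F') into a calendar date."""
        days = int(hex_value, 16)           # hex string -> decimal day count
        return epoch + timedelta(days=days)

    # 0x5B2F = 23343 days after 01/01/1950
    print(packed_hex_to_date("0x5B2F"))     # 2013-11-29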

    Read the article

  • Unable to add host running ubuntu for nagios monitoring?

    - by karthick87
    I am unable to add an Ubuntu server to Nagios monitoring. I am getting a "CHECK_NRPE: Socket timeout after 40 seconds." error for a few services: "CPU Load, Cron File Check, Current Users, Disk Check, NTP Daemon, Time Check, Total Processes, Zombie Processes". Please find the snapshot for the same below. Details: I installed the NRPE plugin on the Ubuntu host. On running the command below from the remote host running Ubuntu (not the Nagios server) I get the following output:
    root@ubuntu-cacher:~# /usr/local/nagios/libexec/check_nrpe -H localhost
    NRPE v2.13
    But on the Nagios server I am getting the "CHECK_NRPE: Socket timeout after 40 seconds." error. Additional information: I am running NRPE under xinetd; when I execute the following command I don't get any output:
    root@ubuntu-cacher:~# netstat -at | grep nrpe
    But I get the following output when checking:
    root@ubuntu-cacher:~# netstat -ant | grep 5666
    tcp 0 0 0.0.0.0:5666 0.0.0.0:* LISTEN
    tcp 0 0 172.29.*.*:5666 172.29.*.*:33693 ESTABLISHED
    tcp 0 0 172.29.*.*:5666 172.29.*.*:33692 ESTABLISHED

    Read the article

  • Parsing stdout with custom format or standard format?

    - by linquize
    To integrate with other executables, an executable may launch another executable and capture its output from stdout. But most programs write their output messages to stdout in a custom, usually human-readable format. So the system integrator has to write a function to parse the output, which is troublesome, and the parser code may be buggy. Do you think this is old-fashioned? Most Unix-style programs do that. Very few programs write to stdout in a standard format such as XML or JSON, which is more modern. Example: Veracity (DVCS) writes JSON to stdout. Should we switch to using modern formats? For a console program, human-readable or easily parsable: which is more important?
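
    As a minimal Python sketch of the trade-off (the tool name and its --json flag below are hypothetical placeholders, not any real program's interface):

    import json
    import subprocess

    # Launch the child process and capture its stdout as text.
    result = subprocess.run(["sometool", "--json"],
                            capture_output=True, text=True, check=True)

    # With a standard format, "parsing" is a single library call...
    data = json.loads(result.stdout)
    print(data.get("status"))

    # ...whereas a custom human-readable format forces a hand-written, fragile parser:
    # for line in result.stdout.splitlines():
    #     if line.startswith("Status:"):
    #         status = line.split(":", 1)[1].strip()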

    Read the article

  • Loops, Recursion and Memoization in JavaScript

    - by Ken Dason
    Originally posted on: http://geekswithblogs.net/kdason/archive/2013/07/25/loops-recursion-and-memoization-in-javascript.aspxAccording to Wikipedia, the factorial of a positive integer n (denoted by n!) is the product of all positive integers less than or equal to n. For example, 5! = 5 x 4 x 3 x 2 x 1 = 120. The value of 0! is 1. We can use factorials to demonstrate iterative loops and recursive functions in JavaScript.  Here is a function that computes the factorial using a for loop: Output: Time Taken: 51 ms Here is the factorial function coded to be called recursively: Output: Time Taken: 165 ms We can speed up the recursive function with the use of memoization.  Hence,  if the value has previously been computed, it is simply returned and the recursive call ends. Output: Time Taken: 17 ms
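
    The post's code samples are not included in this excerpt, so as a stand-in here is a minimal sketch of the three approaches (iterative loop, plain recursion, memoized recursion) in Python rather than the article's JavaScript; the timings quoted above are the article's, not from this sketch.

    import functools

    def factorial_loop(n):
        result = 1
        for i in range(2, n + 1):           # iterative version
            result *= i
        return result

    def factorial_recursive(n):
        return 1 if n <= 1 else n * factorial_recursive(n - 1)

    @functools.lru_cache(maxsize=None)      # memoization: previously computed values are returned directly
    def factorial_memoized(n):
        return 1 if n <= 1 else n * factorial_memoized(n - 1)

    print(factorial_loop(5), factorial_recursive(5), factorial_memoized(5))   # 120 120 120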

    Read the article

  • Reverse all words in current line

    - by KasiyA
    I have a file and I want to reverse every word in it. Read each line until a (.) or the end of the line (\n) is seen; when the first (.) in the line is found, what precedes it is a word, so reverse that word and continue reading the next word in the current line, until the end of the file. Example input file:
    DCBA. HGFE.GI MLK,PON.RQ UTS.
    ZYXWV. 321
    Example output file (what I want):
    ABCD. EFGH.IG KLM,NOP.QR STU.
    VWXYZ. 123
    With this sed script: sed '/\n/!G;s/\(.\)\(.*\n\)/&\2\1/;//D;s/.//' the entire line is reversed. The wrong output produced by the command above:
    IG.EFGH .ABCD QR.NOP,KLM 123 .VWXYZ .STU
    How can I get my desired output? Thanks for your help.
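
    For comparison, a minimal Python sketch of the intended transformation (reverse each run of word characters, leave the delimiters where they are); it is an alternative to the sed approach, not a fix for the script above:

    import re
    import sys

    def reverse_words(line):
        # Reverse each maximal run of word characters; '.', ',' and spaces stay in place.
        return re.sub(r"\w+", lambda m: m.group(0)[::-1], line)

    for line in sys.stdin:
        sys.stdout.write(reverse_words(line))

    # "DCBA. HGFE.GI MLK,PON.RQ UTS." -> "ABCD. EFGH.IG KLM,NOP.QR STU."
    # "ZYXWV. 321"                    -> "VWXYZ. 123"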

    Read the article

  • Scripting with variables from file

    - by Nooster
    I have several videos on my PC that I would like to shorten. For instance, I have a 30-second video where I want to keep the section from second 15 to 20 (a 5-second video). To cut this, I use avconv. avconv -i input.mp4 -ss 15 -acodec copy -vcodec copy -t 5 output.mp4 This command works pretty well. I have many videos I want to cut the same way. This is why I created a text file containing the information: input name, start of cut, length of cut, output name. These are written into in.txt, which looks like this:
    input.mp4 15 5 output.mp4
    input1.mp4 32 10 output1.mp4
    input2.mp4 10 7 output2.mp4
    ...
    My question is: how do I have to modify the avconv command to cut my videos automatically? What I tried was this, but it didn't work at all: avconv -i $1 -ss $2 -acodec copy -vcodec copy -t $3 $4 < in.txt Any idea?
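
    One way to do this is a small wrapper script that reads in.txt and runs avconv once per line. A minimal Python sketch using the exact flags from the question (a shell while-read loop would work just as well):

    import subprocess

    # Each line of in.txt is: input start length output
    with open("in.txt") as f:
        for line in f:
            fields = line.split()
            if len(fields) != 4:
                continue                    # skip blank or malformed lines
            infile, start, length, outfile = fields
            subprocess.run(["avconv", "-i", infile, "-ss", start,
                            "-acodec", "copy", "-vcodec", "copy",
                            "-t", length, outfile],
                           check=True)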

    Read the article

  • C#/.NET Little Wonders: Getting Caller Information

    - by James Michael Hare
    Originally posted on: http://geekswithblogs.net/BlackRabbitCoder/archive/2013/07/25/c.net-little-wonders-getting-caller-information.aspx Once again, in this series of posts I look at the parts of the .NET Framework that may seem trivial, but can help improve your code by making it easier to write and maintain. The index of all my past little wonders posts can be found here. There are times when it is desirable to know who called the method or property you are currently executing.  Some applications of this could include logging libraries, or possibly even something more advanced that may server up different objects depending on who called the method. In the past, we mostly relied on the System.Diagnostics namespace and its classes such as StackTrace and StackFrame to see who our caller was, but now in C# 5, we can also get much of this data at compile-time. Determining the caller using the stack One of the ways of doing this is to examine the call stack.  The classes that allow you to examine the call stack have been around for a long time and can give you a very deep view of the calling chain all the way back to the beginning for the thread that has called you. You can get caller information by either instantiating the StackTrace class (which will give you the complete stack trace, much like you see when an exception is generated), or by using StackFrame which gets a single frame of the stack trace.  Both involve examining the call stack, which is a non-trivial task, so care should be done not to do this in a performance-intensive situation. For our simple example let's say we are going to recreate the wheel and construct our own logging framework.  Perhaps we wish to create a simple method Log which will log the string-ified form of an object and some information about the caller.  We could easily do this as follows: 1: static void Log(object message) 2: { 3: // frame 1, true for source info 4: StackFrame frame = new StackFrame(1, true); 5: var method = frame.GetMethod(); 6: var fileName = frame.GetFileName(); 7: var lineNumber = frame.GetFileLineNumber(); 8: 9: // we'll just use a simple Console write for now 10: Console.WriteLine("{0}({1}):{2} - {3}", 11: fileName, lineNumber, method.Name, message); 12: } So, what we are doing here is grabbing the 2nd stack frame (the 1st is our current method) using a 2nd argument of true to specify we want source information (if available) and then taking the information from the frame.  This works fine, and if we tested it out by calling from a file such as this: 1: // File c:\projects\test\CallerInfo\CallerInfo.cs 2:  3: public class CallerInfo 4: { 5: Log("Hello Logger!"); 6: } We'd see this: 1: c:\projects\test\CallerInfo\CallerInfo.cs(5):Main - Hello Logger! This works well, and in fact CallStack and StackFrame are still the best ways to examine deeper into the call stack.  But if you only want to get information on the caller of your method, there is another option… Determining the caller at compile-time In C# 5 (.NET 4.5) they added some attributes that can be supplied to optional parameters on a method to receive caller information.  These attributes can only be applied to methods with optional parameters with explicit defaults.  Then, as the compiler determines who is calling your method with these attributes, it will fill in the values at compile-time. 
These are the currently supported attributes available in the  System.Runtime.CompilerServices namespace": CallerFilePathAttribute – The path and name of the file that is calling your method. CallerLineNumberAttribute – The line number in the file where your method is being called. CallerMemberName – The member that is calling your method. So let’s take a look at how our Log method would look using these attributes instead: 1: static int Log(object message, 2: [CallerMemberName] string memberName = "", 3: [CallerFilePath] string fileName = "", 4: [CallerLineNumber] int lineNumber = 0) 5: { 6: // we'll just use a simple Console write for now 7: Console.WriteLine("{0}({1}):{2} - {3}", 8: fileName, lineNumber, memberName, message); 9: } Again, calling this from our sample Main would give us the same result: 1: c:\projects\test\CallerInfo\CallerInfo.cs(5):Main - Hello Logger! However, though this seems the same, there are a few key differences. First of all, there are only 3 supported attributes (at this time) that give you the file path, line number, and calling member.  Thus, it does not give you as rich of detail as a StackFrame (which can give you the calling type as well and deeper frames, for example).  Also, these are supported through optional parameters, which means we could call our new Log method like this: 1: // They're defaults, why not fill 'em in 2: Log("My message.", "Some member", "Some file", -13); In addition, since these attributes require optional parameters, they cannot be used in properties, only in methods. These caveats aside, they do let you get similar information inside of methods at a much greater speed!  How much greater?  Well lets crank through 1,000,000 iterations of each.  instead of logging to console, I’ll return the formatted string length of each.  Doing this, we get: 1: Time for 1,000,000 iterations with StackTrace: 5096 ms 2: Time for 1,000,000 iterations with Attributes: 196 ms So you see, using the attributes is much, much faster!  Nearly 25x faster in fact.  Summary There are a few ways to get caller information for a method.  The StackFrame allows you to get a comprehensive set of information spanning the whole call stack, but at a heavier cost.  On the other hand, the attributes allow you to quickly get at caller information baked in at compile-time, but to do so you need to create optional parameters in your methods to support it. Technorati Tags: Little Wonders,CSharp,C#,.NET,StackFrame,CallStack,CallerFilePathAttribute,CallerLineNumberAttribute,CallerMemberName

    Read the article

  • Why is bzip2 needed in the kernel patch instructions?

    - by user12657
    This was from here.
    Extract the patch:
    tar -xvzf /usr/src/web100-2.5.22-200810130047.tar.gz
    bzip2 web100/ web100-2.6.27-2.5.22-200810130047.patch
    Test the patch:
    bzip2 -dc /usr/src/linux/web100/ web100-2.6.27-2.5.22-200810130047.patch.bz2 | patch -p1 --dry-run
    I looked at the .patch (the diff output of many files) and the .patch.bz2 file after the bzip2 command (which is also the diff output of many files), and they seem to be the same. My question is: why is bzip2 even needed to turn the .patch into a .patch.bz2? Is it for the redirection to standard output from the -dc option for the patch command? Even if it is, why not just use the patch command in a form something like this: patch -p1 < patchfile? I don't see why the bzip2 is done here. Also, I think the bzip2 command might have an extra space after web100/, right?

    Read the article

  • Why must I restart for HDMI audio?

    - by David Shochat
    I have a System76 Lemur Ultra laptop, which has an HDMI port. When I plug in the HDMI cable, it is recognized immediately for video, but not for audio, i.e., it does not show up in the Output tab of the Sound Settings dialog. However, if I do a full restart with the cable plugged in, then it does show up and can be selected. I've also discovered that manually killing and restarting pulseaudio with the HDMI plugged in will force it to be recognized as an available audio output. So my question is, what is broken here? I've thought about writing a bug in launchpad, but I don't know which component is malfunctioning. Is it the kernel? ALSA? Pulse Audio? Something else? By the way, the same bug seems to exist in reverse, namely if I disconnect the HDMI, it is still listed in the Output tab until I again restart the OS. This is 12.04 LTS.

    Read the article

  • How can I set my screen resolution to match my TV?

    - by Scott Severance
    I have a computer in my classroom that's connected to an LG smart TV (that's actually not so smart. I wouldn't recommend buying one.). For the touch interface, the TV wants a resolution of 1920x1080 at 60Hz. However, I can't seem to set the computer to that resolution. The display settings only offer 1024x768 and 640x480. The computer dual boots with Windows XP, where widescreen options are available in approximately the required size, but the exact resolution -- or even aspect ratio-- isn't available in XP either. I tried the following command: xrandr -s 1920x1080 -r 60 The response was: Size 1920x1080 not found in available modes Back in the old days, the solution would be to edit xorg.conf. However, since that file no longer exists, and I haven't found up-to-date info, I don't know what else to do. If it helps, this machine will never be connected to a different display, so resolution flexibility isn't important. Here's the output of lshw: *-display:0 description: VGA compatible controller product: 4 Series Chipset Integrated Graphics Controller vendor: Intel Corporation physical id: 2 bus info: pci@0000:00:02.0 version: 03 width: 64 bits clock: 33MHz capabilities: vga_controller bus_master cap_list rom configuration: driver=i915 latency=0 resources: irq:42 memory:fe800000-febfffff memory:d0000000-dfffffff ioport:ecd8(size=8) *-display:1 UNCLAIMED description: Display controller product: 4 Series Chipset Integrated Graphics Controller vendor: Intel Corporation physical id: 2.1 bus info: pci@0000:00:02.1 version: 03 width: 64 bits clock: 33MHz According to the system settings, my graphics driver is unknown and my "experience" is standard. This is 64-bit Ubuntu 12.04 (Precise) Note: There are a number of similar questions to this one, but they didn't include any answers that helped me. Update After posting this question, I noticed one in the sidebar that I hadn't found through search but which appeared to contain the answer. 
Based on that question, I created the /etc/X11/xorg.conf file below: Section "ServerLayout" Identifier "X.org Configured" Screen 0 "Screen0" 0 0 InputDevice "Mouse0" "CorePointer" InputDevice "Keyboard0" "CoreKeyboard" EndSection Section "Files" ModulePath "/usr/lib/xorg/modules" FontPath "/usr/share/fonts/X11/misc" FontPath "/var/lib/defoma/x-ttcidfont-conf.d/dirs/TrueType" FontPath "built-ins" EndSection Section "Module" Load "glx" Load "dri2" Load "dbe" Load "dri" Load "record" Load "extmod" EndSection Section "InputDevice" Identifier "Keyboard0" Driver "kbd" EndSection Section "InputDevice" Identifier "Mouse0" Driver "mouse" Option "Protocol" "auto" Option "Device" "/dev/input/mice" Option "ZAxisMapping" "4 5 6 7" EndSection Section "Monitor" Identifier "Monitor0" VendorName "LG" ModelName "Smart TV" EndSection Section "Device" ### Available Driver options are:- ### Values: <i>: integer, <f>: float, <bool>: "True"/"False", ### <string>: "String", <freq>: "<f> Hz/kHz/MHz", ### <percent>: "<f>%" ### [arg]: arg optional #Option "DRI" # [<bool>] #Option "ColorKey" # <i> #Option "VideoKey" # <i> #Option "FallbackDebug" # [<bool>] #Option "Tiling" # [<bool>] #Option "LinearFramebuffer" # [<bool>] #Option "Shadow" # [<bool>] #Option "SwapbuffersWait" # [<bool>] #Option "TripleBuffer" # [<bool>] #Option "XvMC" # [<bool>] #Option "XvPreferOverlay" # [<bool>] #Option "DebugFlushBatches" # [<bool>] #Option "DebugFlushCaches" # [<bool>] #Option "DebugWait" # [<bool>] #Option "HotPlug" # [<bool>] #Option "RelaxedFencing" # [<bool>] Identifier "Card0" Driver "intel" BusID "PCI:0:2:0" EndSection Section "Screen" Identifier "Screen0" Device "Card0" Monitor "Monitor0" DefaultDepth 24 #SubSection "Display" # Viewport 0 0 # Depth 1 #EndSubSection #SubSection "Display" # Viewport 0 0 # Depth 4 #EndSubSection #SubSection "Display" # Viewport 0 0 # Depth 8 #EndSubSection #SubSection "Display" # Viewport 0 0 # Depth 15 #EndSubSection #SubSection "Display" # Viewport 0 0 # Depth 16 #EndSubSection SubSection "Display" Viewport 0 0 Depth 24 Modes "1024x768" "1920x1080" EndSubSection EndSection According to /var/log/Xorg.0.log, my settings aren't being applied. In fact, I wonder if the config file is even being read. [ 1209.083] (**) intel(0): Depth 24, (--) framebuffer bpp 32 [ 1209.084] (==) intel(0): RGB weight 888 [ 1209.084] (==) intel(0): Default visual is TrueColor [ 1209.084] (II) intel(0): Integrated Graphics Chipset: Intel(R) G41 [ 1209.084] (--) intel(0): Chipset: "G41" [ 1209.084] (**) intel(0): Relaxed fencing enabled [ 1209.084] (**) intel(0): Wait on SwapBuffers? enabled [ 1209.084] (**) intel(0): Triple buffering? 
enabled [ 1209.084] (**) intel(0): Framebuffer tiled [ 1209.084] (**) intel(0): Pixmaps tiled [ 1209.084] (**) intel(0): 3D buffers tiled [ 1209.084] (**) intel(0): SwapBuffers wait enabled [ 1209.084] (==) intel(0): video overlay key set to 0x101fe [ 1209.172] (II) intel(0): Output VGA1 using monitor section Monitor0 [ 1209.260] (II) intel(0): EDID for output VGA1 [ 1209.260] (II) intel(0): Printing probed modes for output VGA1 [ 1209.260] (II) intel(0): Modeline "1024x768"x60.0 65.00 1024 1048 1184 1344 768 771 777 806 -hsync -vsync (48.4 kHz) [ 1209.260] (II) intel(0): Modeline "800x600"x60.3 40.00 800 840 968 1056 600 601 605 628 +hsync +vsync (37.9 kHz) [ 1209.260] (II) intel(0): Modeline "800x600"x56.2 36.00 800 824 896 1024 600 601 603 625 +hsync +vsync (35.2 kHz) [ 1209.260] (II) intel(0): Modeline "848x480"x60.0 33.75 848 864 976 1088 480 486 494 517 +hsync +vsync (31.0 kHz) [ 1209.260] (II) intel(0): Modeline "640x480"x59.9 25.18 640 656 752 800 480 489 492 525 -hsync -vsync (31.5 kHz) [ 1209.260] (II) intel(0): Output VGA1 connected [ 1209.260] (II) intel(0): Using user preference for initial modes [ 1209.260] (II) intel(0): Output VGA1 using initial mode 1024x768 [ 1209.260] (II) intel(0): Using default gamma of (1.0, 1.0, 1.0) unless otherwise stated. [ 1209.260] (II) intel(0): Kernel page flipping support detected, enabling [ 1209.260] (==) intel(0): DPI set to (96, 96)

    Read the article

  • Ubuntu Server SSH backspace bad character

    - by Edwin Lunando
    While I'm using SSH to connect to my server, the backspace shows me a bad character. The backspace is the one with the question mark. The backspace itself works normally, but on the screen it isn't very neat to look at the stacking question marks. This is the example (the square-bracketed question mark means backspace):
    cat[?] output: ca: not found
    cat[?][?] output: c: not found
    cat[?][?][?] output: nothing, because it simply deletes the 3 characters.
    Please help. Thank you.

    Read the article

  • Iptables issue: can't SSH to remote machines

    - by Lonston
    I want to SSH to the server 192.168.1.15 from my machine; my IP is 192.168.1.99 and the destination, 192.168.1.15, is up. This is a LAN network: there are 30 machines connected to the network and working fine. I'm playing around on the local machines because I need to apply the same rules on a production VPS. I have applied the iptables policies below on my machine, 192.168.1.99. Now I can't receive any packets from outside and I can't send any packets outside after applying the chain policies below:
    iptables -P INPUT DROP
    iptables -P OUTPUT DROP
    iptables -P FORWARD DROP
    After the above, I added the rules below, which should allow SSH from my machine to 192.168.1.15, but I still can't access 192.168.1.15:
    iptables -A INPUT -p tcp -i eth0 --dport 22 -m state --state NEW,ESTABLISHED -j ACCEPT
    iptables -A OUTPUT -p tcp -o eth0 --sport 22 -m state --state ESTABLISHED -j ACCEPT
    iptables -A OUTPUT -o eth0 -p tcp --dport 22 -m state --state NEW,ESTABLISHED -j ACCEPT
    iptables -A INPUT -i eth0 -p tcp --sport 22 -m state --state ESTABLISHED -j ACCEPT
    Can anyone please check whether my rules are right? I still can't access machine .15.

    Read the article

  • Alsa doesn't work in vlc

    - by freebird
    ALSA audio output works fine from the terminal, e.g. aplay /usr/share/sounds/alsa/Noise.wav. But I have to change from the default to ALSA audio output in VLC. I found it under Tools > Preferences > Audio > Outputs. The issue is that when I change it to ALSA, I lose all sound. When I leave the default, I get an annoying audio delay of about 200ms or 500ms. From what I have found, you have to use ALSA audio output to fix that issue. Updated 6-26-2011 10:28pm. To fix the ALSA audio output:
    sudo add-apt-repository ppa:ferramroberto/vlc
    sudo apt-get update
    sudo apt-get install vlc mozilla-plugin-vlc
    Then I opened Update Manager, where there were 2 updates for VLC; I installed them and rebooted. Now ALSA works fine and audio is in sync with video.

    Read the article

  • After update, audio won't play through HDMI cable

    - by ckhenderson
    Yesterday (June 7th) I faithfully updated a bunch of stuff through update manager (though I'm not sure what since I never read what I updated any more because I'm a bad person). Afterward, I discovered that I could no longer play audio through my tv via HDMI. The sound settings menu seems to have completely changed and the shell script that I wrote a week ago (with much pain and effort as I had never before written a shell script) to toggle between my laptop speakers and the HDMI cable output no longer works. When I type in pactl set-card-profile 0 output:hdmi-surround in the command line I get Failure: No such entity Presumably this means something with Pulse Audio changed but, while I'm learning a lot about the inner workings of Ubuntu/Linux, I'm not an expert and would love some help. EDIT: So I noticed that pacmd set-card-profile 0 output:hdmi-stereo seems to get everything working. Still, shouldn't there be a way to do this through the GUI as well?

    Read the article

  • What are the files in /dev/input/ and what they do?

    - by Pouya
    I'm fairly new to Ubuntu and I've started to search around everywhere and check everything! Recently I saw these files at /dev/input/: eventX, js0, mice, mouseX. By printing the output using "cat" I realized they are somehow responsible for mouse and keyboard input, but the output had a strange character encoding (even for the keyboard). My questions are: what are these files and how can I interpret their data? Are there any other places where I can access the input/output of my Ubuntu machine? And are there any ebooks, manuals, or something similar where I can look up the purpose and structure of Ubuntu system files? (i.e. to find the answer to such questions)
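
    For the eventX files specifically, each record is a fixed-size struct input_event (a timestamp, an event type, a code and a value), which is why cat shows gibberish. A minimal Python sketch for decoding them, assuming a 64-bit system (24-byte records; on 32-bit the two longs are 4 bytes each) and using /dev/input/event3 only as an example device (reading usually requires root):

    import struct

    EVENT_FORMAT = "llHHi"                   # timeval (2 longs), ushort type, ushort code, int value
    EVENT_SIZE = struct.calcsize(EVENT_FORMAT)

    with open("/dev/input/event3", "rb") as dev:
        while True:
            data = dev.read(EVENT_SIZE)
            if len(data) < EVENT_SIZE:
                break
            sec, usec, ev_type, code, value = struct.unpack(EVENT_FORMAT, data)
            print("time=%d.%06d type=%d code=%d value=%d" % (sec, usec, ev_type, code, value))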

    Read the article

  • Boot time seems unusually long on MSI GX660R (bootchart included)

    - by Sman789
    After upgrading (clean install) to Ubuntu 12.04, the speed issue when running programs has reduced on my MSI GX660R laptop. However, the boot time is still much longer (over a minute, even after BIOS) than on the many less powerful laptops I have encountered running the same OS, and I was wondering if anyone could help me improve it. I use the FGLRX driver, if that makes any difference. I have uploaded a boot chart, it can be found here http://imageshack.us/photo/my-images/4/bootchartl.png/ As you can see, the boot time is over a minute even after BIOS. A 'designed for Vista' laptop from ages ago which I installed Ubuntu on boots in around thirty seconds, so I think it's a bit strange. Output of dmesg: http://paste.ubuntu.com/1081359/ Output of /var/log/kern.log : http://paste.ubuntu.com/1081363/ Output of /var/log/syslog : http://paste.ubuntu.com/1081365/

    Read the article

  • vlc 1.1.9 not working properly

    - by jaggib
    I have installed VLC 1.1.9 on a Wubi-installed Ubuntu 11.04 using the Ubuntu Software Center. Now when I try playing videos in VLC (any format), full-screen mode doesn't show controls and usually doesn't get me back to windowed mode. It sometimes crashes to the login screen. I tried setting the video output to 'X11 output mode' and 'XVideo output (XCB)', but the above problem persists and they also bring another problem: full-screen mode doesn't always respond, and when it does, it shows the video over the desktop instead of inside the player. The only way to get Ubuntu to function normally is to restart the system. I tried not using 'Embed video in interface', but still the same problem. VLC runs perfectly well in Windows. How can I make VLC function properly on my system, or do I need to install another player? My system config is:
    Graphics: VIA/S3G UniChrome Pro IGP
    Processor: AMD Sempron(tm) Processor 2800+
    RAM: 1.5GB
    Motherboard Name: MS-7181 (MSI)

    Read the article

  • Remote Diagnostic Agent (RDA) version 4.30

    - by inowodwo
    posted by Maurice Bauhahn Remote Diagnostic Agent (RDA) version 4.30 was released on December 11th A free download can be accessed via Knowledge Management article 314422.1 and installed in any Enterprise Performance Management 11.1.2.x environment. EPM-specific instructions are available in Knowledge Management article 1304885.1. This RDA version incorporates two new modules (EAS=Essbase Administration Services; HWA=Hyperion Web Analysis) and improvements in modules and profiles relating to twelve other Hyperion applications (EPM, EPMA, ESS, FCM, HFM, HFR, HIR, HPL, HPSV, HSS, PR, and HSV). To follow best practice, run related RDA profiles [for example: "perl rda.pl -vnSCRPp Hyperion1112_EAS"] and attach the output zip file [by default in \rda\output\] to your service requests. The comprehensive set of details provided in such output files should help technicians to avoid delays in handling service requests (by avoiding ping-pong communications resulting from repeated requests for additional values).

    Read the article

  • 1080p on TV not working

    - by and471
    I am trying to connect my laptop (Samsung Series 5 Ultrabook) to my TV (Toshiba WLT66/67 Series) and get it to output to the TV at 1080i (so basically 1920x1080) through HDMI. The issue I have is that in 'Displays' the highest resolution offered for the monitor is 1280x768; however, I know the TV can accept 1920x1080 at 30Hz, since Windows 7 allows me to do this. I looked around and found that I could use xrandr to try to get this resolution; however, when I do:
    cvt 1920 1080 30
    # 1920x1080 29.95 Hz (CVT) hsync: 33.01 kHz; pclk: 79.75 MHz
    Modeline "1920x1080_30.00" 79.75 1920 1976 2168 2416 1080 1083 1088 1102 -hsync +vsync
    xrandr --newmode "1920x1080_30.00" 79.75 1920 1976 2168 2416 1080 1083 1088 1102 -hsync +vsync
    xrandr --addmode HDMI1 1920x1080_30.00
    xrandr --output HDMI1 --mode 1920x1080_30.00
    my TV screen just remains blank :( Any help? (If you need any more information, please ask.)

    Read the article

  • Using Subjects to Deploy Queries Dynamically

    - by Roman Schindlauer
    In the previous blog posting, we showed how to construct and deploy query fragments to a StreamInsight server, and how to re-use them later. In today’s posting we’ll integrate this pattern into a method of dynamically composing a new query with an existing one. The construct that enables this scenario in StreamInsight V2.1 is a Subject. A Subject lets me create a junction element in an existing query that I can tap into while the query is running. To set this up as an end-to-end example, let’s first define a stream simulator as our data source: var generator = myApp.DefineObservable(     (TimeSpan t) => Observable.Interval(t).Select(_ => new SourcePayload())); This ‘generator’ produces a new instance of SourcePayload with a period of t (system time) as an IObservable. SourcePayload happens to have a property of type double as its payload data. Let’s also define a sink for our example—an IObserver of double values that writes to the console: var console = myApp.DefineObserver(     (string label) => Observer.Create<double>(e => Console.WriteLine("{0}: {1}", label, e)))     .Deploy("ConsoleSink"); The observer takes a string as parameter which is used as a label on the console, so that we can distinguish the output of different sink instances. Note that we also deploy this observer, so that we can retrieve it later from the server from a different process. Remember how we defined the aggregation as an IQStreamable function in the previous article? We will use that as well: var avg = myApp     .DefineStreamable((IQStreamable<SourcePayload> s, TimeSpan w) =>         from win in s.TumblingWindow(w)         select win.Avg(e => e.Value))     .Deploy("AverageQuery"); Then we define the Subject, which acts as an observable sequence as well as an observer. Thus, we can feed a single source into the Subject and have multiple consumers—that can come and go at runtime—on the other side: var subject = myApp.CreateSubject("Subject", () => new Subject<SourcePayload>()); Subject are always deployed automatically. Their name is used to retrieve them from a (potentially) different process (see below). Note that the Subject as we defined it here doesn’t know anything about temporal streams. It is merely a sequence of SourcePayloads, without any notion of StreamInsight point events or CTIs. So in order to compose a temporal query on top of the Subject, we need to 'promote' the sequence of SourcePayloads into an IQStreamable of point events, including CTIs: var stream = subject.ToPointStreamable(     e => PointEvent.CreateInsert<SourcePayload>(e.Timestamp, e),     AdvanceTimeSettings.StrictlyIncreasingStartTime); In a later posting we will show how to use Subjects that have more awareness of time and can be used as a junction between QStreamables instead of IQbservables. Having turned the Subject into a temporal stream, we can now define the aggregate on this stream. We will use the IQStreamable entity avg that we defined above: var longAverages = avg(stream, TimeSpan.FromSeconds(5)); In order to run the query, we need to bind it to a sink, and bind the subject to the source: var standardQuery = longAverages     .Bind(console("5sec average"))     .With(generator(TimeSpan.FromMilliseconds(300)).Bind(subject)); Lastly, we start the process: standardQuery.Run("StandardProcess"); Now we have a simple query running end-to-end, producing results. 
What follows next is the crucial part of tapping into the Subject and adding another query that runs in parallel, using the same query definition (the “AverageQuery”) but with a different window length. We are assuming that we connected to the same StreamInsight server from a different process or even client, and thus have to retrieve the previously deployed entities through their names: // simulate the addition of a 'fast' query from a separate server connection, // by retrieving the aggregation query fragment // (instead of simply using the 'avg' object) var averageQuery = myApp     .GetStreamable<IQStreamable<SourcePayload>, TimeSpan, double>("AverageQuery"); // retrieve the input sequence as a subject var inputSequence = myApp     .GetSubject<SourcePayload, SourcePayload>("Subject"); // retrieve the registered sink var sink = myApp.GetObserver<string, double>("ConsoleSink"); // turn the sequence into a temporal stream var stream2 = inputSequence.ToPointStreamable(     e => PointEvent.CreateInsert<SourcePayload>(e.Timestamp, e),     AdvanceTimeSettings.StrictlyIncreasingStartTime); // apply the query, now with a different window length var shortAverages = averageQuery(stream2, TimeSpan.FromSeconds(1)); // bind new sink to query and run it var fastQuery = shortAverages     .Bind(sink("1sec average"))     .Run("FastProcess"); The attached solution demonstrates the sample end-to-end. Regards, The StreamInsight Team

    Read the article

< Previous Page | 145 146 147 148 149 150 151 152 153 154 155 156  | Next Page >