Search Results

Search found 9988 results on 400 pages for 'tv less in jersey'.


  • Java Cloud Service Integration to REST Service

    - by Jani Rautiainen
    Java Cloud Service (JCS) provides a platform to develop and deploy business applications in the cloud. In Fusion Applications Cloud deployments, customers do not have the option to deploy custom applications developed with JDeveloper, to ensure the integrity and supportability of the hosted application service. Instead, custom applications can be deployed to JCS and integrated with the Fusion Applications Cloud instance. This series of articles will go through the features of JCS, provide end-to-end examples of how to develop and deploy applications on JCS, and show how to integrate them with the Fusion Applications instance.

    In this article a custom application integrating with a REST service will be implemented. We will use the REST services provided by Taleo as an example; however, the same approach will work with any REST service. In this example the data from the REST service is used to populate a dynamic table.

    Pre-requisites

    Access to a Cloud instance. In order to deploy the application, access to a JCS instance is needed; a free trial JCS instance can be obtained from the Oracle Cloud site. To register you will need a credit card, even though the credit card will not be charged. To register, simply click "Try it" and choose the "Java" option. The confirmation email will contain the connection details; see this video for an example of the registration. Once the request is processed you will be assigned two service instances: Java and Database. Applications deployed to JCS must use Oracle Database Cloud Service as their underlying database, so when a JCS instance is created a database instance is associated with it using a JDBC data source. The cloud services can be monitored and managed through the web UI. For details refer to Getting Started with Oracle Cloud.

    JDeveloper. JDeveloper contains Cloud-specific features related to, for example, connection and deployment. To use these features, download JDeveloper from the JDeveloper download site by clicking the "Download JDeveloper 11.1.1.7.1 for ADF deployment on Oracle Cloud" link; this version of JDeveloper has the JCS integration features that will be used in this article. For versions that do not include the Cloud integration features, the Oracle Java Cloud Service SDK or the JCS Java Console can be used for deployment. For details on installing and configuring JDeveloper refer to the installation guide. For details on the SDK refer to Using the Command-Line Interface to Monitor Oracle Java Cloud Service and Using the Command-Line Interface to Manage Oracle Java Cloud Service.

    Access to a local database. The database associated with the JCS instance cannot be connected to with JDBC. Since creating ADFbc business components requires a JDBC connection, we will need access to a local database.

    3rd party libraries. This example uses some 3rd party libraries for implementing the REST service call and processing the input/output content. Other libraries may also be used; however, these are tested to work.

    Jersey 1.x. The Jersey library will be used as a client to make the call to the REST service. The JCS documentation for supported specifications states: Java API for RESTful Web Services (JAX-RS) 1.1. So Jersey 1.x will be used. Download the single-JAR Jersey bundle; in this example the Jersey 1.18 JAR bundle is used.

    json-simple. The json-simple library will be used to process the JSON objects. Download the JAR file; in this example json-simple-1.1.1.jar is used.

    Accessing data in Taleo

    Before implementing the application it is beneficial to familiarize oneself with the data in Taleo.
    The easiest way to do this is by using a RESTClient extension in your browser. Once added to the browser you can access its UI. The client can be used to call the REST services to test the URLs and data before adding them to the application.

    First derive the base URL for the service. This can be done with:

        Method: GET
        URL: https://tbe.taleo.net/MANAGER/dispatcher/api/v1/serviceUrl/<company name>

    The response will contain the base URL to be used for the service calls for the company. Next obtain an authentication token with:

        Method: POST
        URL: https://ch.tbe.taleo.net/CH07/ats/api/v1/login?orgCode=<company>&userName=<user name>&password=<password>

    The response includes an authentication token that can be used for a few hours to authenticate with the service:

        {
          "response": {
            "authToken": "webapi26419680747505890557"
          },
          "status": {
            "detail": {},
            "success": true
          }
        }

    To authenticate the service calls, navigate to "Headers -> Custom Header" and add a new request header with:

        Name: Cookie
        Value: authToken=webapi26419680747505890557

    Once the authentication token is defined, the tool can be used to invoke REST services; for example:

        Method: GET
        URL: https://ch.tbe.taleo.net/CH07/ats/api/v1/object/candidate/search.xml?status=16

    This data will be used in the application to be created. For details on the Taleo REST services refer to the Taleo Business Edition REST API Guide.

    Create Application

    First a Fusion Web Application is created and configured. Start JDeveloper and click "New Application":

        Application Name: JcsRestDemo
        Application Package Prefix: oracle.apps.jcs.test
        Application Template: Fusion Web Application (ADF)

    Configure Local Cloud Connection

    Follow the steps documented in the "Java Cloud Service ADF Web Application" article to configure a local database connection needed to create the ADFbc objects.

    Configure Libraries

    Add the 3rd party libraries to the class path. Create the following directory and copy the jar files into it:

        <JDEV_USER_HOME>/JcsRestDemo/lib

    Select the "Model" project, navigate "Application -> Project Properties -> Libraries and Classpath -> Add JAR / Directory" and add the two 3rd party libraries.

    Accessing Data from Taleo

    To access data from Taleo using the REST service, the 3rd party libraries will be used. Two Java classes are implemented: one representing the Candidate object and another for accessing the Taleo repository.

    Candidate

    The Candidate object is a POJO used to represent the candidate data obtained from the Taleo repository. The data obtained will be used to populate the ADFbc object used to display the data on the UI. The Candidate object simply contains the variables we obtain using the REST services and the getters/setters for them. Navigate "New -> General -> Java -> Java Class", enter "Candidate" as the name and create it in the package "oracle.apps.jcs.test.model".
    Copy / paste the following as the content:

        import oracle.jbo.domain.Number;

        public class Candidate {
            private Number candId;
            private String firstName;
            private String lastName;

            public Candidate() {
                super();
            }

            public Candidate(Number candId, String firstName, String lastName) {
                super();
                this.candId = candId;
                this.firstName = firstName;
                this.lastName = lastName;
            }

            public void setCandId(Number candId) {
                this.candId = candId;
            }

            public Number getCandId() {
                return candId;
            }

            public void setFirstName(String firstName) {
                this.firstName = firstName;
            }

            public String getFirstName() {
                return firstName;
            }

            public void setLastName(String lastName) {
                this.lastName = lastName;
            }

            public String getLastName() {
                return lastName;
            }
        }

    Taleo Repository

    The TaleoRepository class will interact with the Taleo REST services. The logic will query data from Taleo and populate Candidate objects with the data. The Candidate objects will then be used to populate the ADFbc object used to display data on the UI. Navigate "New -> General -> Java -> Java Class", enter "TaleoRepository" as the name and create it in the package "oracle.apps.jcs.test.model". Copy / paste the following as the content (for details of the implementation refer to the documentation in the code):

        import com.sun.jersey.api.client.Client;
        import com.sun.jersey.api.client.ClientResponse;
        import com.sun.jersey.api.client.WebResource;
        import com.sun.jersey.core.util.MultivaluedMapImpl;

        import java.io.StringReader;
        import java.util.ArrayList;
        import java.util.Iterator;
        import java.util.List;
        import java.util.Map;

        import javax.ws.rs.core.MediaType;
        import javax.ws.rs.core.MultivaluedMap;

        import oracle.jbo.domain.Number;

        import org.json.simple.JSONArray;
        import org.json.simple.JSONObject;
        import org.json.simple.parser.JSONParser;

        /**
         * This class interacts with the Taleo REST services
         */
        public class TaleoRepository {
            /**
             * Connection information needed to access the Taleo services
             */
            String _company = null;
            String _userName = null;
            String _password = null;

            /**
             * Jersey client used to access the REST services
             */
            Client _client = null;

            /**
             * Parser for processing the JSON objects used as
             * input / output for the services
             */
            JSONParser _parser = null;

            /**
             * The base url for constructing the REST URLs. This is obtained
             * from Taleo with a service call
             */
            String _baseUrl = null;

            /**
             * Authentication token obtained from Taleo using a service call.
             * The token can be used to authenticate on subsequent
             * service calls. The token will expire in 4 hours
             */
            String _authToken = null;

            /**
             * Static url that can be used to obtain the url used to construct
             * service calls for a given company
             */
            private static String _taleoUrl =
                "https://tbe.taleo.net/MANAGER/dispatcher/api/v1/serviceUrl/";

            /**
             * Default constructor for the repository.
             * Authentication details are passed as parameters and used to generate
             * an authentication token. Note that each service call will
             * generate its own token. This is done to avoid dealing with the expiry
             * of the token. Also only 20 tokens are allowed per user simultaneously,
             * so instead for each call there is a login / logout.
             *
             * @param company the company for which the service calls are made
             * @param userName the user name to authenticate with
             * @param password the password to authenticate with
             */
            public TaleoRepository(String company, String userName, String password) {
                super();
                _company = company;
                _userName = userName;
                _password = password;
                _client = Client.create();
                _parser = new JSONParser();
                _baseUrl = getBaseUrl();
            }

            /**
             * This obtains the base url for a company to be used
             * to construct the urls for service calls
             * @return base url for the service calls
             */
            private String getBaseUrl() {
                String result = null;
                if (null != _baseUrl) {
                    result = _baseUrl;
                } else {
                    try {
                        String company = _company;
                        WebResource resource = _client.resource(_taleoUrl + company);
                        ClientResponse response =
                            resource.type(MediaType.APPLICATION_FORM_URLENCODED_TYPE).get(ClientResponse.class);
                        String entity = response.getEntity(String.class);
                        JSONObject jsonObject = (JSONObject)_parser.parse(new StringReader(entity));
                        JSONObject jsonResponse = (JSONObject)jsonObject.get("response");
                        result = (String)jsonResponse.get("URL");
                    } catch (Exception ex) {
                        ex.printStackTrace();
                    }
                }
                return result;
            }

            /**
             * Generates an authentication token that can be used to authenticate on
             * subsequent service calls. Note that each service call will
             * generate its own token. This is done to avoid dealing with the expiry
             * of the token. Also only 20 tokens are allowed per user simultaneously,
             * so instead for each call there is a login / logout.
             * @return authentication token that can be used to authenticate on
             * subsequent service calls
             */
            private String login() {
                String result = null;
                try {
                    MultivaluedMap<String, String> formData = new MultivaluedMapImpl();
                    formData.add("orgCode", _company);
                    formData.add("userName", _userName);
                    formData.add("password", _password);
                    WebResource resource = _client.resource(_baseUrl + "login");
                    ClientResponse response =
                        resource.type(MediaType.APPLICATION_FORM_URLENCODED_TYPE).post(ClientResponse.class, formData);
                    String entity = response.getEntity(String.class);
                    JSONObject jsonObject = (JSONObject)_parser.parse(new StringReader(entity));
                    JSONObject jsonResponse = (JSONObject)jsonObject.get("response");
                    result = (String)jsonResponse.get("authToken");
                } catch (Exception ex) {
                    throw new RuntimeException("Unable to login ", ex);
                }
                if (null == result)
                    throw new RuntimeException("Unable to login ");
                return result;
            }

            /**
             * Releases an authentication token. Each call to login must be followed
             * by a call to logout after the processing is done. This is required as
             * the tokens are limited to 20 per user and if not released the tokens
             * will only expire after 4 hours.
             * @param authToken
             */
            private void logout(String authToken) {
                WebResource resource = _client.resource(_baseUrl + "logout");
                resource.header("cookie", "authToken=" + authToken).post(ClientResponse.class);
            }

            /**
             * This method is used to obtain a list of candidates using a REST
             * service call. In this example the query is hard coded to query
             * based on status. The url constructed to access the service is:
             * <_baseUrl>/object/candidate/search.xml?status=16
             * @return List of candidates obtained with the service call
             */
            public List<Candidate> getCandidates() {
                List<Candidate> result = new ArrayList<Candidate>();
                try {
                    // First login; note that in the finally block we must have logout
                    _authToken = "authToken=" + login();
                    /**
                     * Construct the URL, the resulting url will be:
                     * <_baseUrl>/object/candidate/search.xml?status=16
                     */
                    MultivaluedMap<String, String> formData = new MultivaluedMapImpl();
                    formData.add("status", "16");
                    JSONArray searchResults =
                        (JSONArray)getTaleoResource("object/candidate/search", "searchResults", formData);
                    /**
                     * Process the results. The resulting JSON object is something like
                     * this (simplified for readability):
                     *
                     * {
                     *   "response": {
                     *     "searchResults": [
                     *       {
                     *         "candidate": {
                     *           "candId": 211,
                     *           "firstName": "Mary",
                     *           "lastName": "Stochi",
                     *
                     * The logic here will find the candidate object(s), obtain the desired
                     * data from them, construct a Candidate object based on the data
                     * and add it to the results.
                     */
                    for (Object object : searchResults) {
                        JSONObject temp = (JSONObject)object;
                        JSONObject candidate = (JSONObject)findObject(temp, "candidate");
                        Long candIdTemp = (Long)candidate.get("candId");
                        Number candId = (null == candIdTemp ? null : new Number(candIdTemp));
                        String firstName = (String)candidate.get("firstName");
                        String lastName = (String)candidate.get("lastName");
                        result.add(new Candidate(candId, firstName, lastName));
                    }
                } catch (Exception ex) {
                    ex.printStackTrace();
                } finally {
                    if (null != _authToken)
                        logout(_authToken);
                }
                return result;
            }

            /**
             * Convenience method to construct the url for the service call, invoke the
             * service and obtain a resource from the response
             * @param path the path for the service to be invoked. This is combined
             * with the base url to construct a url for the service
             * @param resource the key for the object in the response that will be
             * obtained
             * @param parameters any parameters used for the service call. The call
             * is slightly different depending on whether parameters exist or not.
             * @return the resource from the response for the service call
             */
            private Object getTaleoResource(String path, String resource,
                                            MultivaluedMap<String, String> parameters) {
                Object result = null;
                try {
                    WebResource webResource = _client.resource(_baseUrl + path);
                    ClientResponse response = null;
                    if (null == parameters)
                        response = webResource.header("cookie", _authToken).get(ClientResponse.class);
                    else
                        response = webResource.queryParams(parameters).header("cookie", _authToken).get(ClientResponse.class);
                    String entity = response.getEntity(String.class);
                    JSONObject jsonObject = (JSONObject)_parser.parse(new StringReader(entity));
                    result = findObject(jsonObject, resource);
                } catch (Exception ex) {
                    ex.printStackTrace();
                }
                return result;
            }

            /**
             * Convenience method to recursively find an object with a key,
             * traversing down from a given root object. This will traverse a
             * JSONObject / JSONArray recursively to find a matching key; if found,
             * the object with the key is returned.
             * @param root root object which contains the key searched for
             * @param key the key for the object to search for
             * @return the object matching the key
             */
            private Object findObject(Object root, String key) {
                Object result = null;
                if (root instanceof JSONObject) {
                    JSONObject rootJSON = (JSONObject)root;
                    if (rootJSON.containsKey(key)) {
                        result = rootJSON.get(key);
                    } else {
                        Iterator children = rootJSON.entrySet().iterator();
                        while (children.hasNext()) {
                            Map.Entry entry = (Map.Entry)children.next();
                            Object child = entry.getValue();
                            if (child instanceof JSONObject || child instanceof JSONArray) {
                                result = findObject(child, key);
                                if (null != result)
                                    break;
                            }
                        }
                    }
                } else if (root instanceof JSONArray) {
                    JSONArray rootJSON = (JSONArray)root;
                    for (Object child : rootJSON) {
                        if (child instanceof JSONObject || child instanceof JSONArray) {
                            result = findObject(child, key);
                            if (null != result)
                                break;
                        }
                    }
                }
                return result;
            }
        }

    Creating Business Objects

    While a JCS application can be created without a local database, the local database is required when using ADFbc objects even if no database objects are referred to. For this example we will create a "transient" view object that will be programmatically populated based on the data obtained from the Taleo REST services.

    Creating ADFbc objects

    Choose the "Model" project and navigate "New -> Business Tier : ADF Business Components : View Object". On the "Initialize Business Components Project" dialog choose the local database connection created in the previous step. On step 1 enter "JcsRestDemoVO" as the "Name" and choose "Rows populated programmatically, not based on query". On step 2 create the following attributes:

        CandId
            Type: Number
            Updatable: Always
            Key Attribute: checked
        Name
            Type: String
            Updatable: Always

    On steps 3 and 4 accept the defaults and click "Next". On step 5 check the "Application Module" checkbox and enter "JcsRestDemoAM" as the name. Click "Finish" to generate the objects.

    Populating the VO

    To display the data on the UI, the transient VO is populated programmatically based on the data obtained from the Taleo REST services. Open "JcsRestDemoVOImpl.java" and copy / paste the following as the content (for details of the implementation refer to the documentation in the code):

        import java.sql.ResultSet;

        import java.util.List;
        import java.util.ListIterator;

        import oracle.jbo.server.ViewObjectImpl;
        import oracle.jbo.server.ViewRowImpl;
        import oracle.jbo.server.ViewRowSetImpl;

        // ---------------------------------------------------------------------
        // ---    File generated by Oracle ADF Business Components Design Time.
        // ---    Tue Feb 18 09:40:25 PST 2014
        // ---    Custom code may be added to this class.
        // ---    Warning: Do not modify method signatures of generated methods.
        // ---------------------------------------------------------------------
        public class JcsRestDemoVOImpl extends ViewObjectImpl {
            /**
             * This is the default constructor (do not remove).
             */
            public JcsRestDemoVOImpl() {
            }

            @Override
            public void executeQuery() {
                /**
                 * For some reason we need to reset everything, otherwise
                 * 2nd entry to the UI screen may fail with
                 * "java.util.NoSuchElementException" in createRowFromResultSet
                 * call to "candidates.next()". I am not sure why this is happening
                 * as the Iterator is new and "hasNext" is true at the point
                 * of the execution. My theory is that since the iterator object is
                 * exactly the same, the VO cache somehow reuses the iterator including
                 * the pointer that has already exhausted the iterable elements on the
                 * previous run. Working around the issue here by cleaning out
                 * everything on the VO every time before the query is executed on the VO.
                 */
                getViewDef().setQuery(null);
                getViewDef().setSelectClause(null);
                setQuery(null);
                this.reset();
                this.clearCache();
                super.executeQuery();
            }

            /**
             * executeQueryForCollection - overridden for custom java data source support.
             */
            protected void executeQueryForCollection(Object qc, Object[] params, int noUserParams) {
                /**
                 * Integrate with the Taleo REST services using the TaleoRepository class.
                 * A list of candidates matching a hard coded query is obtained.
                 */
                TaleoRepository repository = new TaleoRepository(<company>, <username>, <password>);
                List<Candidate> candidates = repository.getCandidates();
                /**
                 * Store an iterator for the candidates as user data on the collection.
                 * This will be used in createRowFromResultSet to create rows based on
                 * the custom iterator.
                 */
                ListIterator<Candidate> candidatescIterator = candidates.listIterator();
                setUserDataForCollection(qc, candidatescIterator);
                super.executeQueryForCollection(qc, params, noUserParams);
            }

            /**
             * hasNextForCollection - overridden for custom java data source support.
             */
            protected boolean hasNextForCollection(Object qc) {
                boolean result = false;
                /**
                 * Determines whether there are candidates for which to create a row
                 */
                ListIterator<Candidate> candidates =
                    (ListIterator<Candidate>)getUserDataForCollection(qc);
                result = candidates.hasNext();
                /**
                 * If all candidates have been created, indicate that processing is done
                 */
                if (!result) {
                    setFetchCompleteForCollection(qc, true);
                }
                return result;
            }

            /**
             * createRowFromResultSet - overridden for custom java data source support.
             */
            protected ViewRowImpl createRowFromResultSet(Object qc, ResultSet resultSet) {
                /**
                 * Obtain the next candidate from the collection and create a row
                 * for it.
                 */
                ListIterator<Candidate> candidates =
                    (ListIterator<Candidate>)getUserDataForCollection(qc);
                ViewRowImpl row = createNewRowForCollection(qc);
                try {
                    Candidate candidate = candidates.next();
                    row.setAttribute("CandId", candidate.getCandId());
                    row.setAttribute("Name", candidate.getFirstName() + " " + candidate.getLastName());
                } catch (Exception e) {
                    e.printStackTrace();
                }
                return row;
            }

            /**
             * getQueryHitCount - overridden for custom java data source support.
             */
            public long getQueryHitCount(ViewRowSetImpl viewRowSet) {
                /**
                 * For this example this is not implemented; rather we always return 0.
                 */
                return 0;
            }
        }

    Creating UI

    Choose the "ViewController" project and navigate "New -> Web Tier : JSF : JSF Page". On the "Create JSF Page" dialog enter "JcsRestDemo" as the name and ensure that "Create as XML document (*.jspx)" is checked. Open "JcsRestDemo.jspx", navigate to "Data Controls -> JcsRestDemoAMDataControl -> JcsRestDemoVO1" and drag & drop the VO onto the <af:form> as an "ADF Read-only Table". Accept the defaults in "Edit Table Columns". To execute the query, navigate to "Data Controls -> JcsRestDemoAMDataControl -> JcsRestDemoVO1 -> Operations -> Execute" and drag & drop the operation onto the <af:form> as a "Button".

    Deploying to JCS

    Follow the same steps as documented in the previous article "Java Cloud Service ADF Web Application". Once deployed, the application can be accessed with the URL:

        https://java-[identity domain].java.[data center].oraclecloudapps.com/JcsRestDemo-ViewController-context-root/faces/JcsRestDemo.jspx

    The UI displays a list of candidates obtained from the Taleo REST services.

    Summary

    In this article we learned how to integrate with REST services using the Jersey library in JCS. Future articles will cover various other integration techniques.

    Read the article

  • Putting a MovieMaterial behind a DAE model in Papervision3D

    - by didibus
    Hi, I'm doing a project using FLARManager augmented reality and the Papervision3D library. Unfortunately, Papervision is giving me a lot of problems. My scene3D contains a DAE model and a plane. The plane has a MovieMaterial and is playing a video through FLVPlayback. The DAE and the plane are both inside the same DisplayObject3D container, and FLARManager transforms the container so that everything appears at the angle of the marker. My DAE model is a TV, and the screen of the TV is transparent. I want to have my plane inside the DAE model, so that the movie playing on the plane's material appears to be what is playing on the TV. The problem is that even if the plane has a lower Z index than the TV, it always appears in front of the TV. How do I make my plane and its MovieMaterial appear behind the TV, so that some of its corners are cut off by the TV and the part of the TV that's transparent lets me see the movie? If that's impossible, does anyone have an idea of how I could get the desired effect of having a movie play on the screen of my DAE TV model? Thank you.

    Read the article

  • How to store JSON into a POJO using Jackson

    - by user2963680
    I am developing a module where I am using a REST service to get data. I am not getting how to store the JSON using Jackson when the request also has query params. Any help is really appreciated as I am new to this. I am trying to do server-side filtering in an ExtJS infinite grid, which sends the requests below to the REST service.

    When the page loads the first time, it sends:

        http://myhost/mycontext/rest/populateGrid?_dc=9999999999999&page=1&start=0&limit=500

    When you select a filter on name and place, it sends:

        http://myhost/mycontext/rest/populateGrid?_dc=9999999999999&filter=[{"type":"string","value":"Tom","field":"name"},{"type":"string","value":"London","field":"Location"}]&page=1&start=0&limit=500

    I am trying to save this in a POJO and then send it to the database to retrieve data. For this, on the REST side I have written something like this:

        @Provider
        @Path("/rest")
        public interface restAccessPoint {
            @GET
            @Path("/populateGrid")
            @Produces({MediaType.APPLICATION_JSON})
            public Response getallGridData(FilterJsonToJava filterparam,
                                           @QueryParam("page") String page,
                                           @QueryParam("start") String start,
                                           @QueryParam("limit") String limit);
        }

        public class FilterJsonToJava {
            @JsonProperty(value = "filter")
            private List<Filter> data;
            // .. getter and setter below
        }

        public class Filter {
            @JsonProperty("type")
            private String type;
            @JsonProperty("value")
            private String value;
            @JsonProperty("field")
            private String field;
            // ...getters and setters below
        }

    I am getting the below error:

        The following warnings have been detected with resource and/or provider classes:
        WARNING: A HTTP GET method, public abstract javax.ws.rs.core.Response
        com.xx.xx.xx.xxxxx (com.xx.xx.xx.xx.json.FilterJsonToJava, java.lang.String, java.lang.String, java.lang.String),
        should not consume any entity.

        com.xx.xx.xx.xx.json.FilterJsonToJava, and Java type class com.xx.xx.xx.FilterJsonToJava,
        and MIME media type application/octet-stream was not found
        [11/6/13 17:46:54:065] 0000001c ContainerRequ E The registered message body readers compatible with the MIME media type are:
        application/octet-stream
        com.sun.jersey.core.impl.provider.entity.ByteArrayProvider
        com.sun.jersey.core.impl.provider.entity.FileProvider
        com.sun.jersey.core.impl.provider.entity.InputStreamProvider
        com.sun.jersey.core.impl.provider.entity.DataSourceProvider
        com.sun.jersey.core.impl.provider.entity.RenderedImageProvider
        */* -> com.sun.jersey.core.impl.provider.entity.FormProvider
        ...
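    One way around both messages (a sketch only, assuming Jersey 1.x with Jackson 1.x on the classpath; the resource class name here is made up): because a GET request has no entity body, the filter JSON can be taken as a plain @QueryParam string and deserialized by hand with Jackson's ObjectMapper.

        import java.io.IOException;
        import java.util.Collections;
        import java.util.List;

        import javax.ws.rs.GET;
        import javax.ws.rs.Path;
        import javax.ws.rs.Produces;
        import javax.ws.rs.QueryParam;
        import javax.ws.rs.core.MediaType;
        import javax.ws.rs.core.Response;

        // Jackson 1.x packages; with Jackson 2.x the imports move to com.fasterxml.jackson.*
        import org.codehaus.jackson.map.ObjectMapper;
        import org.codehaus.jackson.type.TypeReference;

        @Path("/rest")
        public class GridResource {   // illustrative class name, not from the question

            private static final ObjectMapper MAPPER = new ObjectMapper();

            @GET
            @Path("/populateGrid")
            @Produces(MediaType.APPLICATION_JSON)
            public Response getAllGridData(@QueryParam("filter") String filterJson,
                                           @QueryParam("page") String page,
                                           @QueryParam("start") String start,
                                           @QueryParam("limit") String limit) throws IOException {
                // "filter" arrives as a JSON array string, e.g.
                // [{"type":"string","value":"Tom","field":"name"}]
                List<Filter> filters = (filterJson == null || filterJson.isEmpty())
                        ? Collections.<Filter>emptyList()
                        : MAPPER.readValue(filterJson, new TypeReference<List<Filter>>() { });
                // ... use filters, page, start and limit to build the database query ...
                return Response.ok().build();
            }
        }

    With this shape there is no entity parameter on the GET method, so Jersey no longer looks for a message body reader, and the Filter POJO from the question can be reused as-is.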

    Read the article

  • A few questions about a good projector for my PC and TV?

    - by jasondavis
    I have always wanted a projector for my TV, satellite, cable, and even my PC in a spare bedroom. Well, it's more of a home office that I spend most of my time in, and the catch here is that it is a small room. The room is only the standard 8 feet tall, and about 13 feet wide on the wall where I would like to mount the projector and the wall where the screen would be, so only about 13 feet from projector to screen. I would like to know...

    1) From experience or knowledge, what would be a good projector I could hook up to my satellite box and also my PC? Cheaper is better in this case, but I would still like the best image for my buck and something reliable. There is no sunlight in the room to worry about either.

    2) From that distance of about 12-13 feet away, how big of a clear picture could I expect?

    3) What kind of cables would I need to purchase and run through my attic to my cable/satellite receiver box as well as my PC?

    4) These cables in question 3 would most likely need to be a good 15-20 feet in length to reach; would I need anything special for that to work at those distances?

    Read the article

  • How can I configure Firefox to assume I have less memory?

    - by WoLpH
    Firefox has a few different settings that automatically get tuned based on the system RAM. This is all great if you're running nothing besides Firefox, but when you're running half a dozen apps at the same time and they all assume that they can take a decent chunk of memory, it just kills the box.

    Example settings:

        http://kb.mozillazine.org/Browser.sessionhistory.max_total_viewers
        http://kb.mozillazine.org/Browser.cache.memory.capacity

    How can I make Firefox automatically configure all these settings with the assumption that I only have 512 MB of memory instead of 4 GB (or whatever number, but you get the idea)? I am running Ubuntu 12.04 with Firefox 14.

    Current workarounds:

    - Running a Windows XP virtual machine with 512 MB of RAM. It actually runs smoothly and takes less memory (including Windows) to run than having Firefox (or Chrome for that matter) run standalone.
    - Installing the 32-bit version of Firefox (apt-get install firefox:i386); with it the base memory usage is only about 50% of what it is with the 64-bit version.

    Read the article

  • How to convert an MKV video to a less CPU-intensive video format?

    - by marco.ragogna
    I have a couple of videos at 1920x800 that use the MKV format (4.7 GB). A friend of mine has an old PC that has problems playing them; the CPU is working at nearly 100%. Is it possible to convert these videos to a less CPU-intensive video format? The final size is not a problem; it can even double or more. Which format should I convert them to, and which software for Windows should I use (preferably free)? Thank you in advance.

    Read the article

  • How to increase the signal/range of an antenna-less Wi-Fi repeater/booster?

    - by kenorb
    I have a BT Home Hub in the upper flat (2-3 walls away) and I'm using a WPS Wireless-N Wi-Fi Range Router Repeater Extender in my flat, where I'm using my laptop. These are antenna-less devices. Are there any life-hack tricks to increase the signal/range of my repeater without buying a new, more powerful repeater? I've already tried moving my repeater closer to the ceiling and putting aluminium foil underneath, but it didn't help. Are there any methods, specific plates or materials which can boost the signal?

    Specification:

        Model: WN518W2
        Frequency range: 2.4-2.4835GHz
        Wireless transmit power: 14~17 dBm (Typical)
        Wireless Signal Rates With Automatic Fallback:
            11n: Up to 300Mbps (dynamic)
            11g: Up to 54Mbps (dynamic)
            11b: Up to 11Mbps (dynamic)
        Modulation Technology: DBPSK, DQPSK, CCK, OFDM, 16-QAM, 64-QAM
        Receiver Sensitivity:
            300M: -68dBm@10% PER
            150M: -68dBm@10% PER
            108M: -68dBm@10% PER
            54M: -68dBm@10% PER
            11M: -85dBm@8% PER
            6M: -88dBm@10% PER
            1M: -90dBm@8% PER
        Product dimensions: 11 x 6 x 7 cm

    Read the article

  • Colors less saturated in Snow Leopard Finder, Preview & Safari than in Chrome -- why?

    - by Andrew Swift
    If I look at the picture here in Safari, Preview or the Finder, the shades of blue are significantly different from what I see if I look at the picture in Google Chrome (they are less saturated than in Chrome). Since I am a photographer and need to prepare pictures for publication online, I need to know what a JPEG should really look like in order to color-correct it. I have an excellent Eizo monitor that is correctly calibrated. If I open the same image in Photoshop CS3, I get the Chrome colors if I use Monitor RGB under View > Proof Setup, and I get the Safari colors if I use Macintosh RGB. Can anyone explain the difference between these two settings, and the difference between Safari and Chrome? Which colors are correct? Photos that I prepared under Windows (during the past five years) now seem washed out in the Apple Finder and Preview, even though I had correctly prepared them. Is there a setting on the Macintosh for color calibration, besides the monitor calibration control panel?

    Read the article

  • Can expire_logs_days be less than 1 day in MySQL?

    - by Scott
    So... yesterday I received an "after the fact" email about a campaign that has started for one of the services that I run. Now the DB server is getting hammered, hard, to the tune of about 300 MB/min in binary logging for the replica. As you can imagine, this is chewing up space at a fairly tremendous rate. My normal 7-day expiry of binary logs just isn't cutting it. I've resorted to truncating the logs to just the last 4 hours (while verifying that replication is up to date with mk-heartbeat) with:

        PURGE MASTER LOGS BEFORE DATE_SUB( NOW(), INTERVAL 4 HOUR);

    I'm just running that from cron every few hours to weather the storm, but it made me question the minimum value for expire_logs_days. I haven't come across a value that is less than 1, but that doesn't mean that it isn't possible. http://dev.mysql.com/doc/refman/5.0/en/server-system-variables.html#sysvar_expire_logs_days gives the type as numeric, but doesn't indicate if it's expecting integers.

    Read the article

  • Why do people like widescreen when it is, de facto, less space?

    - by Kerry
    I find that many of my friends (non-programmers or designers) like widescreens. It makes very little sense to me, as you in fact have less space than with a 4:3 (do the math). The closer the screen is to a perfect square, the more space you actually have on it. I have a 21" 16:9 and two 19" 4:3 monitors. The 21" is nearly the same height, though I think it's a tenth of an inch shorter if I'm correct. I forget the calculation, but it is nearly the same actual space. I can understand if you're using your computer for constant movie-watching, but I think that's more of people's "ideal" than a reality. Thoughts?
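    For reference, here is the calculation being alluded to, assuming two screens with the same diagonal d and aspect ratio a:b:

        \text{width} = \frac{a\,d}{\sqrt{a^2 + b^2}}, \qquad
        \text{height} = \frac{b\,d}{\sqrt{a^2 + b^2}}, \qquad
        \text{area} = \frac{a b}{a^2 + b^2}\, d^2

        4{:}3:\ \frac{12}{25}\,d^2 = 0.480\,d^2
        \qquad
        16{:}9:\ \frac{144}{337}\,d^2 \approx 0.427\,d^2

    So at an equal diagonal, a 4:3 panel has roughly 12% more area than a 16:9 one; comparing a 21-inch 16:9 with a 19-inch 4:3 requires plugging each diagonal into the formula separately.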

    Read the article

  • Database design for a media server containing movies, music, TV and everything in between?

    - by user364114
    In the near future I am attempting to design a media server as a personal project. My first consideration to get the project underway is architecture. It will certainly be web based, but more specifically I am looking for suggestions on the database design. So far I am considering something like the following, where I use [] to represent a table, the first text is the table name to give an idea of its purpose, and the items within {} would be fields of the table. Also note, fid is a functional id referencing some other table.

    [Item {id, value/name, description, link, type}] - this could be any entity: a single song or a whole music album, a game, a movie. I almost see this as a recursive relation, i.e. a song is an item but the album that song is part of is also an item, or for example a TV season is an item, with multiple items being TV episodes.

    [Type {id, fid, mime type, etc}] - file type specific information - could identify how code handles streaming/sending this item to a user

    [Location {id, fid, path to file?}]

    [Users {id, username, email, password, ...?}] - user account information

    [UAC {id, fid, access level}] - I almost feel it's more flexible to separate access control permissions from the user accounts themselves

    [ItemLog {id, fid, fid2, timestamp}] - fid for user id, and fid2 for item id - this way we know which user accessed what and when

    [UserLog {id, fid, timestamp}] - both are logs for access, whether login or last item access

    [Quota {id, fid, cap}] - some sort of way to throttle users from queuing up the entire site and letting it download

    ... Suggestions or comments are welcome, as the hope is that this will become an open source project once some code is laid out.
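    To make the recursive [Item] relation concrete, here is a minimal, hypothetical sketch of it as JPA entities; the class and field names simply mirror the proposed tables and are otherwise illustrative, and getters/setters are omitted:

        import java.util.List;

        import javax.persistence.Entity;
        import javax.persistence.GeneratedValue;
        import javax.persistence.Id;
        import javax.persistence.ManyToOne;
        import javax.persistence.OneToMany;

        @Entity
        public class Item {
            @Id
            @GeneratedValue
            private Long id;

            private String name;        // the value/name field of the proposed [Item] table
            private String description;
            private String link;

            // The recursive relation: an album or TV-season Item contains song/episode Items.
            @ManyToOne
            private Item parent;

            @OneToMany(mappedBy = "parent")
            private List<Item> children;

            // Rough stand-in for the proposed [Type] table (mime type, handler hints).
            @ManyToOne
            private ItemType type;

            // getters and setters omitted
        }

        @Entity
        class ItemType {
            @Id
            @GeneratedValue
            private Long id;

            private String mimeType;

            // getters and setters omitted
        }

    With this shape, an album row and each of its track rows are all Item rows linked through parent, which matches the "a song is an item but the album is also an item" idea above.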

    Read the article

  • Is it possible to make a div 50px less than 100% in CSS?

    - by Derek
    Exact duplicate of:

        http://stackoverflow.com/questions/11103728/css-to-achieve-width100-150px
        http://stackoverflow.com/questions/8877827/how-can-an-element-have-a-width-of-100-50px-using-only-css
        http://stackoverflow.com/questions/651317/div-width-100-minus-fixed-amount-of-pixels
        http://stackoverflow.com/questions/899107/how-can-i-do-width-100-100px-in-css

    Is it possible to make a div 50px less than 100% in pure CSS? I want the <div> to be only 50px less than 100%. I don't want any JavaScript.

    Read the article

  • What is it called when write or read returns less than requested?

    - by Vi
    What term should I use to describe situations (or bugs in software) caused by read, write, send, or recv doing less work than expected? For example, write(fd, "123456", 6); may return 3, and we then need to write "456" to finish our work. I expect any good program to do all of its reads and writes in a loop, without relying on write writing everything at once. Am I right?

    /* I implemented a simple FUSE filesystem which only allows reading and writing with small buffers, very often returning fewer bytes than are in the buffer. Some programs work, some don't. Are they buggy? */
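    As an illustration of the looping pattern described above, here is a minimal Java sketch (the question uses the C calls, but the idea carries over; the helper class name is made up):

        import java.io.IOException;
        import java.nio.ByteBuffer;
        import java.nio.channels.WritableByteChannel;

        final class IoUtil {
            // A single write() may legitimately transfer fewer bytes than requested
            // (a partial or "short" write), so keep calling it until the buffer is
            // drained. With a non-blocking channel you would also wait for
            // writability instead of spinning when write() returns 0.
            static void writeFully(WritableByteChannel channel, ByteBuffer buffer) throws IOException {
                while (buffer.hasRemaining()) {
                    channel.write(buffer);
                }
            }
        }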

    Read the article

  • How to generate thumbnails for less common video containers (mkv, ogm, mp4, flv, rmvb and mov) in Windows 7?

    - by fluxtendu
    So how do I generate thumbnails for these containers? I know that installing "DivX Plus Tech Preview: MKV on Windows 7" does it for MKV, but I think only some registry changes are really necessary, and I want it for other containers too. If it's possible to avoid installing (always bloated) codec packs, that would be nice - maybe only installing ffdshow or essential, separate codecs. (Some time ago I tried reg files for Vista without success.)

    Update: I have installed Win7codecs and tweaked its settings a little, and I got almost everything I want. (I have also re-installed the relevant part of the DivX Plus Tech Preview to get something other than an all-black preview for MKV.)

    Issues that I still want to resolve:

    - Find a cleaner and lighter method.
    - Almost all my rmvb and mov files get an all-black preview (installing Real Media / QuickTime Alternative doesn't help; is it the same with the official players?).
    - With almost all containers (avi, mkv, ogm, mpg), I have a few random files that don't get a preview. I can play them in WMP or another player and haven't found a pattern in the codecs used.
    - All wmv, flv and mp4 files have previews, but I have fewer files in these containers. (I clear my thumbnail cache to test them.)

    More generally, I would like to understand how Windows handles containers and codecs to generate the previews. And a piece of software that let me choose the preview pictures arbitrarily would be convenient too.

    Read the article

  • Can I make a TCP/IP session last less than 60 seconds?

    - by par
    Our server is overloaded with TCP/IP sessions; we have 1200-1500 of them, and most of them are hanging in the TIME_WAIT state. It turns out that a connection in the TIME_WAIT state occupies a socket until a 60-second timeout has elapsed. The problem is that the server becomes unresponsive and many clients are not getting served.

    I have made a simple test: download an XML file from the server with Internet Explorer 8.0. The download finishes in a fraction of a second, but then I see that the TCP/IP connection hangs in the TIME_WAIT state for 60 seconds. Is there any way to get rid of the TIME_WAIT wait, or make it shorter, to free the socket for new connections? I understand why a TCP/IP connection enters the TIME_WAIT state, but I don't understand why Internet Explorer does not close the connection after the XML file download is over.

    The details: our server runs a web service written in Perl (mod_perl). The service provides weather data to clients. The client is a Flash application (actually a Flash ActiveX control embedded in a Windows application). OS: Ubuntu. The Apache "Keep Alive" option is set to 0.

    Read the article

  • How can I make WSUS less invasive for our users?

    - by Cypher
    We have WSUS pushing updates out to our users' workstations, and things are going relatively well with one annoying caveat: there seems to be an issue with a pop-up being displayed in front of some users informing them that their machine will be rebooted in 15 minutes, and they have nothing to say about it. This may be because they did not log out the prior night. Nevertheless, this is a bit too much and is very counter-productive for our users.

    Here is a bit about our environment: our users are running Windows XP Pro and are part of an Active Directory domain. WSUS is being applied via Group Policy (see the snapshot of the GPO that is enforcing the WSUS rules).

    Here is how I want WSUS to work (ideally - I'll take whatever can get me close):

    - I want updates to automatically download and install every night.
    - If a user is not logged in, I would like the machine to reboot.
    - If a user is logged in, I would like their machine not to reboot, but instead wait until the next "installation period" where it can perform any other needed installations and reboot then (provided a user account is not still logged in).
    - If a user is to be prompted for a reboot, it should only happen once per day (if possible), but every time they are prompted, they must have a way to postpone the reboot.
    - I do not want users to be forced to restart their computer whenever the computer thinks it should happen (unless it's after an update installation and there are no logged-in users). It doesn't seem productive to force a system restart in the midst of a person's workday.

    Is there something that I can do with the GPO that would help make WSUS less intrusive? Even if it gave the user an option to Restart Later - that would be better than what is happening now.

    Read the article

  • Hungry hungry BIOS: why do I have less than 4 GiB of memory?

    - by Rhymoid
    I thought I had 4 GiB of memory, but just to be sure, let's ask the BIOS about that:

        ?: sudo dmidecode --type 20
        # dmidecode 2.12
        SMBIOS 2.6 present.

        Handle 0x000B, DMI type 20, 19 bytes
        Memory Device Mapped Address
                Starting Address: 0x00000000000
                Ending Address: 0x0007FFFFFFF
                Range Size: 2 GB
                Physical Device Handle: 0x000A
                Memory Array Mapped Address Handle: 0x000E
                Partition Row Position: Unknown
                Interleave Position: Unknown
                Interleaved Data Depth: Unknown

        Handle 0x000D, DMI type 20, 19 bytes
        Memory Device Mapped Address
                Starting Address: 0x00080000000
                Ending Address: 0x000FFFFFFFF
                Range Size: 2 GB
                Physical Device Handle: 0x000C
                Memory Array Mapped Address Handle: 0x000E
                Partition Row Position: Unknown
                Interleave Position: Unknown
                Interleaved Data Depth: Unknown

    Alright, 4 GiB it is. But I can't use all of it:

        ?: cat /proc/meminfo | head -n 1
        MemTotal:        3913452 kB

    Somehow, somewhere, I lost 274 MiB. Where did 6% of my memory go?

    Now I know the address ranges in DMI are incorrect, because the ACPI memory map reports usable ranges well beyond the ending address of the second memory module:

        ?: dmesg | grep -E "BIOS-e820: .* usable"
        [    0.000000] BIOS-e820: [mem 0x0000000000000000-0x000000000009e7ff] usable
        [    0.000000] BIOS-e820: [mem 0x0000000000100000-0x00000000dee7bfff] usable
        [    0.000000] BIOS-e820: [mem 0x0000000100000000-0x0000000117ffffff] usable

    I get pretty much the same info from /proc/iomem (except for the 4 kiB hole 0x000-0xFFF), which also shows that the kernel only accounts for less than 8 MiB. I guess 0x00000000-0x7FFFFFFF is indeed mapped to the first memory module, and 0x80000000-0xDFFFFFFF to part of the second memory module (a bunch of ACPI NVS things live between 0xDEE7C000 and 0xDEF30FFF, and the remaining 16-something MiB of that range are just 'reserved'). I guess the highest 0x18000000 bytes of the second memory module are mapped above the 4 GiB mark. But even then, there are two problems:

    - 128 MiB (0x08000000 bytes, living somewhere between 0xE0000000 and 0xFFFFFFFF) are still completely unaccounted for. To note, my graphics card is on PCI-Express and (allegedly) has 1 GiB dedicated memory, so that shouldn't be the culprit. Did the BIOS screw up in moving the memory, leaving it partially shadowed by MMIO?
    - Even with this mediocre explanation, I only 'found' 128 MiB. But /proc/meminfo is reporting a much larger deficit; where's the other 146 MiB?

    How does Linux count MemTotal?

    Read the article

  • Is dual-booting an OS more or less secure than running a virtual machine?

    - by Mark
    I run two operating systems on two separate disk partitions on the same physical machine (a modern MacBook Pro). In order to isolate them from each other, I've taken the following steps:

    - Configured /etc/fstab with ro,noauto (read-only, no auto-mount)
    - Fully encrypted each partition with a separate encryption key (committed to memory)

    Let's assume that a virus infects my first partition unbeknownst to me. I log out of the first partition (which encrypts the volume), and then turn off the machine to clear the RAM. I then un-encrypt and boot into the second partition. Can I be reasonably confident that the virus has not / cannot infect both partitions, or am I playing with fire here? I realize that MBPs don't ship with a TPM, so a boot-loader infection going unnoticed is still a theoretical possibility. However, this risk seems about equal to the risk of the VMware/VirtualBox hypervisor being exploited when running a guest OS, especially since the MBP line uses UEFI instead of BIOS.

    This leads to my question: is the dual-partitioning approach outlined above more or less secure than using a virtual machine for isolation of services? Would that change if my computer had a TPM installed?

    Background: Note that I am of course taking all the usual additional precautions, such as checking for OS software updates daily, not logging in as an Admin user unless absolutely necessary, running real-time antivirus programs on both partitions, running a host-based firewall, monitoring outgoing network connections, etc. My question is really a public check to see if I'm overlooking anything here and to try to figure out if my dual-boot scheme actually is more secure than the virtual machine route. Most importantly, I'm just looking to learn more about security issues.

    EDIT #1: As pointed out in the comments, the scenario is a bit on the paranoid side for my particular use-case. But think about people who may be in corporate or government settings and are considering using a virtual machine to run services or applications that are considered "high risk". Are they better off using a VM or a dual-boot scenario as I outlined? An answer that effectively weighs any pros/cons to that trade-off is what I'm really looking for in an answer to this post.

    EDIT #2: This question was partially fueled by debate about whether a virtual machine actually protects a host OS at all. Personally, I think it does, but consider this quote from Theo de Raadt on the OpenBSD mailing list:

        x86 virtualization is about basically placing another nearly full kernel, full of new bugs, on top of a nasty x86 architecture which barely has correct page protection. Then running your operating system on the other side of this brand new pile of shit. You are absolutely deluded, if not stupid, if you think that a worldwide collection of software engineers who can't write operating systems or applications without security holes, can then turn around and suddenly write virtualization layers without security holes.

        - http://kerneltrap.org/OpenBSD/Virtualization_Security

    By quoting Theo's argument, I'm not endorsing it. I'm simply pointing out that there are multiple perspectives here, so I'm trying to find out more about the issue.

    Read the article

  • Inconsistent black levels in Windows 7 Media Center

    - by James G
    I've got an HTPC running Windows 7 64-bit, hooked up to a Samsung LCD TV. My problem is that different types of video are displaying different black levels on the TV. When I play a Blu-ray through Arcsoft Total Media Theater I have to set the "HDMI Black Level" to "Normal" in the TV picture options menu. When I play recorded TV through WMC I have to set it to "Low", otherwise the black colors in the video are washed out and grey. Is there any way to configure the system so all videos are displayed with the same black level? The HDMI black level setting is deep in Samsung's menus, so it's becoming a chore to keep switching it every time I watch a different type of video. I'm using an ATI 4670 graphics card with HDMI output going straight to the TV. In the ATI Catalyst Control Center I've got the pixel format set to RGB 4:4:4 (Full RGB), since the TV won't allow me to change the HDMI black level if I choose one of the other settings.

    Read the article

  • nginx and proxy_hide_header

    - by giskard
    When I curl a URL I get this answer back:

        < HTTP/1.1 200 OK
        < Server: nginx/0.7.65
        < Date: Thu, 04 Mar 2010 12:18:27 GMT
        < Content-Type: application/json
        < Connection: close
        < Expires: Thu, 04 Mar 2010 12:18:27 UTC
        < http.context.path: /1/
        < jersey.response: com.sun.jersey.spi.container.ContainerResponse@17646d60
        < http.custom.headers: {Content-Type=text/plain}
        < http.request.path: /2/messages/latest.json
        < http.status: 200
        < Transfer-Encoding: chunked

    I want to remove:

        < http.context.path: /1/
        < jersey.response: com.sun.jersey.spi.container.ContainerResponse@17646d60
        < http.custom.headers: {Content-Type=text/plain}
        < http.request.path: /2/messages/latest.json
        < http.status: 200

    So I used the proxy_hide_header directive in this way:

        location / {
            if ($arg_id) {
                proxy_pass http..authorized;
                break;
            }
            proxy_pass http..anonymous;
            proxy_hide_header http.context.path;
            proxy_hide_header jersey.response;
            proxy_hide_header http.request.path;
            proxy_hide_header http.status;
        }

    But it doesn't work. Any clues?

    Read the article

  • Revo 3610 not doing HDMI handshake

    - by DoomStone
    I am having a problem with my Revo 3610, which is connected to my TV via HDMI. For some reason it will not do the HDMI handshake with the TV, so the TV does not think that there is anything in the HDMI port. I have tested the TV and it works fine with my laptop and DVD player. It does work sometimes, but this time it has failed for 2 days in a row, and I have tried rebooting, turning the TV off and on, and so on - nothing helps. I can trick the TV into listening to the HDMI by connecting my laptop and then switching the HDMI back to my Revo; this, however, results in the image coming through nicely but with a big fat "Check signal cable." on the screen. I have also tried changing the resolution on the Revo, but this does not help either. Has anyone had this problem before, and if so, how did you fix it? Example: http://i.imgur.com/gguZ4.jpg

    Read the article

  • How can I get my Sapphire Radeon 7850 to output in "1080p"?

    - by Fr33dan
    I have a Radeon 7850 connected to a Vizio 3D-compatible TV. The TV has a function to parse and display SBS-encoded content. On my old graphics card (a Radeon 5770) I just had to select the 1080p option in the Catalyst Control Center; in this mode my TV reported the output mode as "1080p". With the new card the TV reports "1920x1080". I cannot figure out what the difference between the two signals is, but the "1920x1080" mode cannot be switched into 3D mode by the TV. Weirdly, before Windows starts (in the BIOS and so forth) the computer outputs "1080p", so I know the card is capable of it. As soon as the blue login screen comes up, though, it changes back to "1920x1080". I've tried everything I can think of: updated my drivers from 13.3b3 to 13.4, then even tried the 13.5 beta (which I'm still on at the moment), and tried all the "optimized" HD settings in Catalyst - even the 720p modes show the resolution from the TV and not "720p" (which it used to do on the old card when I had to lower the resolution for games).

    Read the article
