I want to write code to log in to websites with Java.
Here is the code:
package login;

import java.net.*;
import java.io.*;

public class ConnectToURL {
    // Variables to hold the URL object and its connection to that URL.
    private static URL URLObj;
    private static URLConnection connect;

    public static void main(String[] args) {
        try {
            CookieManager cManager = new CookieManager();
            CookieHandler.setDefault(cManager);
            // Establish a URL and open a connection to it. Set it to output mode.
            URLObj = new URL("https://accounts.google.com/ServiceLogin?service=mail&continue=https://mail.google.com/mail/#identifier");
            connect = URLObj.openConnection();
            connect.setDoOutput(true);
        }
        catch (MalformedURLException ex) {
            System.out.println("The URL specified was unable to be parsed or uses an invalid protocol. Please try again.");
            System.exit(1);
        }
        catch (Exception ex) {
            System.out.println("An exception occurred. " + ex.getMessage());
            System.exit(1);
        }

        try {
            // Create a buffered writer to the URLConnection's output stream and write our form's parameters.
            BufferedWriter writer = new BufferedWriter(new OutputStreamWriter(connect.getOutputStream()));
            writer.write("[email protected]&Passwd=123456&submit=Login");
            writer.close();

            // Now establish a buffered reader to read the URLConnection's input stream.
            BufferedReader reader = new BufferedReader(new InputStreamReader(connect.getInputStream()));
            String lineRead = "";

            // Read all available lines of data from the URL and print them to screen.
            while ((lineRead = reader.readLine()) != null) {
                System.out.println(lineRead);
            }
            reader.close();
        }
        catch (Exception ex) {
            System.out.println("There was an error reading or writing to the URL: " + ex.getMessage());
        }
    }
}
I have tried this code on Facebook and Gmail, but the problem is that it didn't work.
It keeps telling me that cookies are not enabled. (I used the Chrome browser and they were enabled.)
Is there any other way to achieve this?
If your goal is just to log in to some web site, a much better solution is to use Selenium WebDriver.
It has an API for creating driver instances and working with their web elements.
Code example:
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.htmlunit.HtmlUnitDriver;

public class Example {
    public static void main(String[] args) {
        // Create a new instance of the HtmlUnit driver.
        // Notice that the remainder of the code relies on the interface,
        // not the implementation.
        WebDriver driver = new HtmlUnitDriver();

        // And now use this to visit Google.
        driver.get("http://www.google.com");

        // Find the text input element by its name.
        WebElement element = driver.findElement(By.name("q"));

        // Enter something to search for.
        element.sendKeys("Cheese!");

        // Now submit the form. WebDriver will find the form for us from the element.
        element.submit();

        // Check the title of the page.
        System.out.println("Page title is: " + driver.getTitle());

        driver.quit();
    }
}
It also has a solution for managing cookies - see Cookies.
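As a rough sketch of how that cookie API looks (the URL, cookie name, and value below are placeholders for illustration, not taken from a real site):

import org.openqa.selenium.Cookie;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.htmlunit.HtmlUnitDriver;

public class CookieExample {
    public static void main(String[] args) {
        WebDriver driver = new HtmlUnitDriver();

        // Cookies can only be set for the domain that is currently loaded,
        // so navigate somewhere first (placeholder URL).
        driver.get("http://www.example.com");

        // Add a cookie (name and value are placeholders).
        driver.manage().addCookie(new Cookie("session_id", "abc123"));

        // Read back all cookies for the current domain.
        for (Cookie cookie : driver.manage().getCookies()) {
            System.out.println(cookie.getName() + " = " + cookie.getValue());
        }

        // Remove all cookies for the current domain.
        driver.manage().deleteAllCookies();

        driver.quit();
    }
}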
Just look at the documentation for how to configure driver instances and manage web elements; the preferred way is to use the Page Object pattern.
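For instance, a minimal sketch of a page object for a login form (the field names "email" and "password" are assumptions for illustration; a real site's form will use its own names):

import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;

// A minimal page object: it hides the locators and exposes one action.
public class LoginPage {
    private final WebDriver driver;

    public LoginPage(WebDriver driver) {
        this.driver = driver;
    }

    // "email" and "password" are placeholder field names.
    public void loginAs(String email, String password) {
        driver.findElement(By.name("email")).sendKeys(email);
        driver.findElement(By.name("password")).sendKeys(password);
        // Submitting from any element inside the form submits the form itself.
        driver.findElement(By.name("password")).submit();
    }
}

A test then only calls new LoginPage(driver).loginAs(...) and never touches the locators directly.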
Update:
Locating an element on a web page that doesn't have an id or name attribute can be done using XPath expressions; Firefox extensions for inspecting and testing XPath are very useful for this.
Prefer concise, short XPath expressions.
For example:
<table>
<tr>
<td>
<p>some text here 1</p>
</td>
</tr>
<tr>
<td>
<p>some text here 2</p>
</td>
</tr>
<tr>
<td>
<p>some text here 3</p>
</td>
</tr>
</table>
To get the text "some text here 2" you can use the following XPath:
//tr[2]/td/p
If you know that the text is static, you can use contains():
//p[contains(text(), 'some text here 2')]
To check whether your XPath is unique on the page, the best way is to use the browser console.
How to do this is described here: How to verify an XPath expression
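Once an XPath like the ones above is verified, it can be used from Java through By.xpath. A small sketch (the URL is a placeholder for wherever the table above is served):

import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.htmlunit.HtmlUnitDriver;

public class XpathExample {
    public static void main(String[] args) {
        WebDriver driver = new HtmlUnitDriver();
        // Placeholder URL for a page containing the table shown above.
        driver.get("http://www.example.com/table.html");

        // Locate the second row's paragraph by position.
        WebElement byPosition = driver.findElement(By.xpath("//tr[2]/td/p"));

        // Or locate it by its static text.
        WebElement byText = driver.findElement(
                By.xpath("//p[contains(text(), 'some text here 2')]"));

        System.out.println(byPosition.getText()); // prints: some text here 2
        System.out.println(byText.getText());     // prints: some text here 2
        driver.quit();
    }
}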