My Django app uses django-rest-framework and MySQL. I profiled my app, and almost every endpoint has a long response time, but I don't know what the problem is.
This is one of the slowest endpoints:
180232 function calls (171585 primitive calls) in 1.110 seconds
Ordered by: internal time
List reduced from 757 to 151 due to restriction <0.2>
ncalls tottime percall cumtime percall filename:lineno(function)
105 0.597 0.006 0.597 0.006 /Users/jyj/.pyenv/versions/logispot_app_env/lib/python3.6/site-packages/MySQLdb/connections.py:268(query)
2 0.154 0.077 0.174 0.087 /Users/jyj/.pyenv/versions/logispot_app_env/lib/python3.6/site-packages/MySQLdb/connections.py:81(__init__)
4 0.020 0.005 0.020 0.005 /Users/jyj/.pyenv/versions/logispot_app_env/lib/python3.6/site-packages/MySQLdb/connections.py:254(autocommit)
8800/3582 0.010 0.000 0.828 0.000 {built-in method builtins.getattr}
20156 0.010 0.000 0.022 0.000 {built-in method builtins.isinstance}
200/100 0.009 0.000 0.886 0.009 /Users/jyj/.pyenv/versions/logispot_app_env/lib/python3.6/site-packages/rest_framework/serializers.py:479(to_representation)
2 0.009 0.005 0.009 0.005 {function Connection.set_character_set at 0x109b506a8}
6920 0.009 0.000 0.009 0.000 {built-in method builtins.hasattr}
....
This endpoint returns the first page of a list; the total count is 1000 and the page size is 100. Each record joins just one table. Since the query took a long time, I replaced the Django ORM call with a raw query, but the time stayed the same (maybe I wrote the raw query wrong).
Even the auth check has a long response time:
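A quick sanity check of my own arithmetic, using the numbers from the cProfile output above, suggests the per-query round trips account for most of the time:

```python
# Numbers from the cProfile output above
ncalls = 105        # calls into MySQLdb's connections.py:268(query)
tottime = 0.597     # total seconds spent in those calls

per_query_ms = tottime / ncalls * 1000
print(f"{per_query_ms:.1f} ms per query")            # ≈ 5.7 ms per query

# 100 rows serialized with one extra query each would account for most of it:
print(f"{100 * per_query_ms / 1000:.3f} s for 100 queries")
```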
2199 function calls (2133 primitive calls) in 0.195 seconds
Ordered by: internal time
List reduced from 419 to 84 due to restriction <0.2>
ncalls tottime percall cumtime percall filename:lineno(function)
2 0.153 0.076 0.169 0.084 /Users/jyj/.pyenv/versions/logispot_app_env/lib/python3.6/site-packages/MySQLdb/connections.py:81(__init__)
4 0.016 0.004 0.016 0.004 /Users/jyj/.pyenv/versions/logispot_app_env/lib/python3.6/site-packages/MySQLdb/connections.py:254(autocommit)
3 0.014 0.005 0.014 0.005 /Users/jyj/.pyenv/versions/logispot_app_env/lib/python3.6/site-packages/MySQLdb/connections.py:268(query)
2 0.008 0.004 0.008 0.004 {function Connection.set_character_set at 0x109b506a8}
I think it should take less than 60 ms (maybe my expectation is wrong).
Is the Django query just slow, or is something wrong in my app? I don't know what the problem is.
DEBUG is False in settings. I'm running the app with runserver, not deployed, but the response time is almost the same when deployed (the deployed app is a bit faster, but the query time is the same).
EDIT
View code:
class OrderListCreationAPI(generics.ListCreateAPIView):
    permission_classes = (
        permissions.IsAuthenticatedOrReadOnly,
        IsAdminOrClient,
    )
    pagination_class = StandardListPagination

    def get_queryset(self):
        if self.request.method == 'GET':
            queryset = CacheOrderList.objects.all()
            return queryset
        else:
            return Order.objects.all()

    def get_serializer_class(self):
        if self.request.method == 'GET':
            return CacheOrderListSerializer
        else:
            return OrderSerializer
Serializer code:
class CacheOrderListSerializer(serializers.ModelSerializer):
    base = CacheBaseSerializer(read_only=True)

    class Meta:
        model = CacheOrderList
        fields = '__all__'
Model code:
class CacheBase(models.Model):
    created_time = models.DateTimeField(auto_now_add=True)
    order = models.OneToOneField('order.Order', on_delete=models.CASCADE, related_name='cache', primary_key=True)
    driver_user = models.ForeignKey('member.DriverUser', on_delete=models.SET_NULL, null=True)
    client_name = models.CharField(max_length=20, null=True)
    load_address = models.CharField(max_length=45, null=True)
    load_company = models.CharField(max_length=20, null=True)
    load_date = models.DateField(null=True)
    load_time = models.CharField(max_length=30)
    unload_address = models.CharField(max_length=45, null=True)
    unload_company = models.CharField(max_length=20, null=True)
    unload_date = models.DateField(null=True)
    unload_time = models.CharField(max_length=30)
    stop_count = models.IntegerField(default=0)
    is_round = models.BooleanField(default=False)
    is_mix = models.BooleanField(default=False)
    car_ton = models.CharField(max_length=15, null=True)
    weight = models.FloatField(null=True)
    payment_method = models.BooleanField(default=s.ORDER_PAYMENT_METHOD_ADVANCE)
    contract_fee = models.IntegerField(null=True)
    driver_fee = models.IntegerField(null=True)
    order_fee = models.IntegerField(null=True)
    is_deleted = models.BooleanField(default=False)

class CacheOrderList(models.Model):
    base = models.OneToOneField(CacheBase, on_delete=models.CASCADE, related_name='order_list')
    order_status = models.IntegerField(null=True)
    order_created_time = models.DateTimeField(null=True)
    car_type = models.CharField(max_length=10, null=True)
    asignee = models.CharField(max_length=20, null=True)

    objects = CacheManager()

    class Meta:
        ordering = ('-order_created_time',)
        db_table = 'CacheOrderList'
EDIT2
In the list function, the app already fetches the 100 items, but after that it queries the database again one row at a time instead of using the items it already fetched. So it takes 100 × the time of each query.
This is probably because the serializer does not reuse the records that pagination fetched: 2 pagination queries + 100 serializer queries + the others.
I don't know why that happens.
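What is described above is the classic N+1 query pattern: one query for the page, then one extra query per row for the related object. A minimal standalone illustration using sqlite3 (hypothetical table names, not the actual app schema):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE base (id INTEGER PRIMARY KEY, client_name TEXT);
CREATE TABLE order_list (id INTEGER PRIMARY KEY, base_id INTEGER REFERENCES base(id));
""")
conn.executemany("INSERT INTO base (id, client_name) VALUES (?, ?)",
                 [(i, f"client{i}") for i in range(100)])
conn.executemany("INSERT INTO order_list (id, base_id) VALUES (?, ?)",
                 [(i, i) for i in range(100)])

# N+1: one query for the page, then one query per row for the related base
queries = 1
rows = conn.execute("SELECT id, base_id FROM order_list").fetchall()
for _, base_id in rows:
    conn.execute("SELECT client_name FROM base WHERE id = ?", (base_id,))
    queries += 1
print(queries)  # 101 queries for a 100-row page

# JOIN (what select_related generates): one query total for the same data
joined = conn.execute(
    "SELECT o.id, b.client_name FROM order_list o JOIN base b ON b.id = o.base_id"
).fetchall()
print(len(joined))  # 100 rows from a single query
```

With each MySQL round trip costing several milliseconds, 100 extra queries per page easily adds up to the times in the profile.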
Use select_related on the CacheOrderList queryset for the base field. This makes Django fetch the related objects in the same query (via a SQL JOIN) and cache them on the queryset, so serializing each row no longer hits the database again. If CacheBaseSerializer in turn serializes related objects (e.g. driver_user), you can chain the lookup: select_related('base', 'base__driver_user').
Example:
def get_queryset(self):
    if self.request.method == 'GET':
        # prepare the related-model cache using `select_related`
        queryset = CacheOrderList.objects.all().select_related('base')
        return queryset
    else:
        return Order.objects.all()